
RM bot inactive

This bot, which maintains the list of requested moves, suddenly stopped working after 17:30, 18 July 2012. I'm asking this group if anybody knows how to kick-start the bot. I recall the bot has been stopped before: Wikipedia:Bot owners' noticeboard/Archive 7#RM bot inactive. Wbm1058 (talk) 00:16, 20 July 2012 (UTC)

Discussion at Wikipedia:Administrators' noticeboard/IncidentArchive761#RM bot inactive. Wbm1058 (talk) 13:35, 20 July 2012 (UTC)

...and at Wikipedia:Village pump (technical)#RM bot inactive. Wbm1058 (talk) 12:44, 23 July 2012 (UTC)

HBC Archive Indexerbot needs a new home

Howdy all! Very long time no-really-do-anything-with-Wikipedia! For those (everyone? :)) who have no idea who I am, I "operate" HBC Archive Indexerbot, and ages ago helped out with some of the development of that bot and the HBC AIV Helperbots. Well, I've been horrifically inactive and unhelpful here on Wikipedia for ages now, and have decided it's finally time to see if someone else wants to take over the operation and maintenance of HBCAI, rather than it continuing to languish and get repeatedly ignored by me. This seemed the most logical place to suggest that, but please let me know if anyone thinks there's a better place to move the discussion.

HBCAI is written in Perl and designed to run on a UNIX/Linux-like system. I've been running him on FreeBSD, but he should work just as well on any *NIX really. I'd strongly recommend that whoever takes over this bot be reasonably well versed in both Perl and *nix administration, as getting it up and running on a new system may be a bit of an adventure, and I'm afraid I probably won't be much help due to time constraints. The bot uses considerable CPU and RAM: between 1 and 2 GB of RAM is active during a run. I've been running it 2x/day, and each run lasts close to an hour, if memory serves.

The source is available on the wiki via the bot's user page, but I'll be happy to provide a bundle with the exact sources, including Mediawiki.pm, that I'm using, as it's somewhat finicky about that sort of thing. To be honest I'm not even 100% sure it's working at all at the moment; it seems to break periodically when things change in the Mediawiki software. It's really not a bad bot to run, but I'm just so out of the loop on all things Wikipedia that it's too much effort to try to figure out what's up every time something breaks, and I'm terribly non-responsive on my talk page, and it makes me feel like a jerk. Plus the whole thing really could use a total rewrite, or at least some serious TLC, because it hasn't had much in the last five years or so.

I'll try to check in on the discussion here; if you want to volunteer specifically, though, please also drop me an e-mail via my user page so I know to check in, and I'll try to be reasonably responsive! Thanks! —Krellis (Talk) 00:21, 23 July 2012 (UTC)

Hmmm. Unfortunately, the amount of memory the bot uses precludes it being run on the Toolserver.  madman 03:03, 23 July 2012 (UTC)
I guess that also explains the Forbidden 403 error that's being generated for the Article assessment statistics. Kumioko (talk) 03:20, 23 July 2012 (UTC)
That could be related to memory usage, but it's unlikely. On the Toolserver, if a process is consuming 1 GB or more of memory, it will be killed by the "slayer daemon". This does not tend to affect the Web server (on which few processes are scheduled) or directly result in HTTP response codes in any way. — madman 03:26, 23 July 2012 (UTC)
Good to know, thanks. Kumioko (talk) 03:33, 23 July 2012 (UTC)
Try Labs. The message I took away from my conversations with Labs people at Wikimania was that they have a massive amount of unused resources and that as long as the idea is sane, they're game for just about anything. Sven Manguard Wha? 04:16, 24 July 2012 (UTC)
I'd be willing to work on the bot, but since I don't know perl I would rather re-write it in Python. LegoKontribsTalkM 20:02, 23 July 2012 (UTC)
I'd have no objection to that, but I suspect it would require a new account and bot approval in that case. I wouldn't think it would be TOO terribly difficult to have it support HBCAI's existing formats and stuff if you were so inclined. But overall it's probably a lot more work than someone who knows Perl taking it on as-it-is, FWIW. :) —Krellis (Talk) 20:35, 23 July 2012 (UTC)
I had some free time so I wrote up a new script and tested it in my userspace (see User talk:Legoktm/Index). I've tried to write it to ensure that everything currently supported still works, but I also added in a few features of my own (<month> and <year> support). How does it look? LegoKontribsTalkM 02:28, 24 July 2012 (UTC)
Your method naming conventions do not comply with PEP 8. Σσς. 02:33, 24 July 2012 (UTC)
Currently I'm more concerned about the bot functioning properly than following PEP 8 standards. I know that it is supposed to be lowercase, but my personal preference is mixedCase. LegoKontribsTalkM 08:28, 24 July 2012 (UTC)
I don't have sufficient time or Python expertise to do a detailed review, but if it appears to work I'd certainly be happy to support giving it a try, either under the existing account or under a new account with a new bot approval request, whatever you feel most appropriate. —Krellis (Talk) 18:15, 25 July 2012 (UTC)
I've filed a BRFA since I think that would be easier for maintaining and debugging if I ran the bot. LegoKontribsTalkM 23:52, 26 July 2012 (UTC)
Sounds good, I've weighed in with my support over there. —Krellis (Talk) 00:01, 27 July 2012 (UTC)

Pending changes

Hi - Could anyone here tell me what effect adding this protection tool had on our vandal bots/edit filter? Is it possible for the bots/edit filter to search the pending edit queue and reject a "desired addition" that is not yet reviewed and not yet added to an article? Youreallycan 18:24, 26 July 2012 (UTC)

It has no effect. The pending changes trial was shut down. The community could not reach a consensus on how, or if, they wanted to use it, so they disabled that functionality. Kumioko (talk) 19:22, 26 July 2012 (UTC)
It's scheduled to be re-enabled in December, see Wikipedia:Pending changes/Request for Comment 2012. Anomie 20:02, 26 July 2012 (UTC)
As far as I know, it didn't affect the bots' operation in any way: they processed edits in the same way they always did.
For that matter, all Pending changes protection does is show an older revision of the article to non-logged-in users by default instead of the current version and gives certain users the ability to choose which older revision that is. Anomie 20:02, 26 July 2012 (UTC)
  • Thanks for your comments - @User:Anomie, "As far as I know, it didn't affect the bots' operation in any way: they processed edits in the same way they always did" - Hi, I was active during the trial and I do not remember the vandal bots/edit filter acting at all on desired additions that were not "live" and published on the en Wikipedia project - Are you asserting the vandal bots/edit filter were acting on desired additions that were not already published on the en Wikipedia project? - Youreallycan 21:29, 26 July 2012 (UTC)
    • Yes, they would have been, as they are logged-in users and see those additions in recent changes. — madman 20:50, 28 July 2012 (UTC)

I would expect bot operation to be unaffected by pending changes. For me, the issue is that sometimes we would want the bot operation to be tweaked slightly. For example, if an anti-vandal bot reverts an edit and the version before the reverted edit was approved, then we want the bot to approve its new version. If the version before the vandal edit was not approved, then we would want the bot to leave its new version unapproved also. Yaris678 (talk) 11:57, 30 July 2012 (UTC)

IIRC, that particular case would happen automatically, because reverting to an approved revision would automatically be approved but reverting to a non-approved revision would not. Although I don't recall if this required the "review" right, or if "autoreview" was sufficient. Anomie 00:44, 31 July 2012 (UTC)
It doesn't happen automatically. For example, look at this history. The last version by "Yaris678 test" was accepted by a reviewer, rather than automatically, despite being identical to an accepted version by "The general user".
In terms of rights, Autoreviewer is the old name for Autopatrolled and does not relate to PC. If you have the reviewer right then your edits are automatically accepted if the previous version is accepted. If the previous version is not accepted then it asks you to review the edits. I think we need to give bots the reviewer right. For ones that update categories and that sort of thing nothing more will be required. The edit will be automatically accepted if the previous version was accepted. For anti-vandal bots we would need to make them inspect the version they are reverting to. If it is accepted then they should accept their new version, if it isn't, they shouldn't.
Yaris678 (talk) 11:52, 31 July 2012 (UTC)
Your test account is not autoconfirmed, so it does not even have the autoreview right, so its reversion proves nothing. As for the defunct Autoreviewer group, that is different from the autoreview right. Anomie 03:22, 1 August 2012 (UTC)
So the autoreviewer right and the autoreviewer group are different? Can you point to some documentation that explains this? Can you do a test that shows a version by a non-reviewer (with any rights you fancy) being accepted because it is identical to a version that has already been accepted? Yaris678 (talk) 12:25, 1 August 2012 (UTC)
User rights and groups related to FlaggedRevs (i.e., pending changes) are documented here. — madman 13:49, 1 August 2012 (UTC)
Thanks. So to translate that into pending changes language, if you are confirmed or autoconfirmed, your change doesn't need to be reviewed (under PC level 1). This "right" is called "autoreview". Would someone who is confirmed or autoconfirmed (but not a reviewer) like to repeat my test but doing it slightly differently? Create a test account, as I did, and edit one of the pages in Wikipedia:Pending_changes/Testing with it. Revert that edit with your usual account and see what happens. Yaris678 (talk) 14:53, 1 August 2012 (UTC)
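The rule Yaris678 sketches for anti-vandal bots (accept the bot's revert only if the revision it restores was itself accepted) is simple enough to express directly. A minimal sketch, with `is_accepted` and `mark_accepted` as hypothetical stand-ins for the relevant FlaggedRevs API calls, not real pywikipedia functions:

```python
# Accept-on-revert rule: after reverting vandalism, the bot accepts its own
# new revision only if the revision it restored was itself accepted.
# `is_accepted` and `mark_accepted` are injected callables standing in for
# whatever FlaggedRevs API wrappers the bot framework provides.
def handle_revert(bot_revision, restored_revision, is_accepted, mark_accepted):
    """Return True and accept the bot's edit iff the restored revision
    was already accepted; otherwise leave the edit pending."""
    if is_accepted(restored_revision):
        mark_accepted(bot_revision)
        return True
    return False
```

With stub helpers (an accepted-revision set and a log of acceptances), the two branches behave as described: a revert to an accepted revision is accepted, a revert to a pending one is left pending.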

RM bot

The issue is being taken care of: Wikipedia:Bots/Requests for approval/RMCD bot. Σσς. 04:52, 11 August 2012 (UTC)

pywikipedia

Hi all,

I've been playing with pywikipedia a little bit - and so far using it only to read and parse pages (which has been really quite useful for a number of things). I'd like to move towards using pywikipedia to make changes in an 'approved by human' way.

Now, it's trivial for me to, say, print out the original wikitext of a page/section and then print out the proposed new text and ask the user at the command line if they approve the change - but it would be much more useful/fancy if, when the pywikipedia script had an edit that it wanted to make, it opened up a browser window and gave a preview page that the editor could view. My question is: is that sort of functionality buried anywhere in the pywikipedia libraries? And if not, are there any approximations I could use? Fayedizard (talk) 07:55, 12 August 2012 (UTC)

You will need to use a library such as selenium if you want to have that degree of control over a web browser. Python's standard webbrowser module is simpler, but offers only basic control. Σσς. 08:44, 12 August 2012 (UTC)
ah, clever, I use selenium from Java - it hadn't occurred to me I could play it through Python (obvious in hindsight) - thanks! :) Fayedizard (talk) 09:35, 12 August 2012 (UTC)
I've also written a shim for making web tools out of pywikibot scripts. See tools:~dispenser/sources/htmlput.py. — Dispenser 13:38, 12 August 2012 (UTC)
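For the simple end of the spectrum, the standard-library route alone can produce a workable approval preview: render the proposed change as an HTML diff, open it in the default browser, and confirm at the command line. A minimal sketch (the function names here are illustrative, not part of pywikipedia):

```python
# Human-approval flow using only the standard library: difflib renders an
# HTML side-by-side diff, webbrowser shows it, and the operator confirms
# on the command line.
import difflib
import tempfile
import webbrowser


def make_diff_page(old_text, new_text):
    """Return an HTML page showing current vs. proposed wikitext side by side."""
    return difflib.HtmlDiff(wrapcolumn=80).make_file(
        old_text.splitlines(), new_text.splitlines(),
        fromdesc="current", todesc="proposed")


def preview_edit(old_text, new_text):
    """Open the diff in a browser, then ask the operator to approve it."""
    with tempfile.NamedTemporaryFile(mode="w", suffix=".html",
                                     delete=False) as f:
        f.write(make_diff_page(old_text, new_text))
        path = f.name
    webbrowser.open("file://" + path)
    return input("Save this edit? [y/N] ").lower().startswith("y")
```

Selenium, as suggested above, would be needed only if the script must also drive the browser (click, scroll, read back state) rather than just display a page.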

VIAFbot - approved?

Is VIAFbot (talk · contribs) an approved bot or in the process of getting approval? I see a lot of test edits from it today, and it looks like it may be a port from or otherwise related to de.wikipedia. I'm not an expert on bots; that's why I'm asking here before acting. —C.Fred (talk) 21:28, 16 August 2012 (UTC)

No, it's not. Blocked. Anomie 02:09, 17 August 2012 (UTC)


See Also

Hi all,

I'm considering the idea of a bot that looks at 'See also' sections of articles, and does things like remove elements if they are already in the main article (per the "As a general rule the "See also" section should not repeat links which appear in the article's body or its navigation boxes" part of Wikipedia:Manual_of_Style/Layout#See_also_section). I enjoy writing code and it would be quite nice to write a bit of Python that works its way through a wikiproject and presents some edits to the bot runner for approval. I'm interested to know a) if this is reasonable functionality for a bot and b) if other bots already have this capability. Fayedizard (talk) 11:48, 21 August 2012 (UTC)

I don't think it necessarily would be a good task for a bot, because linking to an article and dealing with its subject aren't the same thing. That said, I don't want to dampen your enthusiasm and there probably is a good task out there for you :) - Jarry1250 [Deliberation needed] 12:34, 21 August 2012 (UTC)
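If someone does pursue this, the duplicate-detection step itself is straightforward to sketch in Python. This is only illustrative: the regex-based parsing is deliberately naive, and a real bot would also need to consider navigation-box links, per the MOS passage quoted above:

```python
# Find wikilink targets that appear both in the article body and in the
# "See also" section. Simplified parsing: ignores templates, navboxes,
# and piped-link display text (only the target before "|" is compared).
import re

LINK = re.compile(r"\[\[([^|\]#]+)")


def duplicate_see_also_links(wikitext):
    """Return See-also targets that are already linked in the body."""
    parts = re.split(r"==\s*See also\s*==", wikitext, maxsplit=1,
                     flags=re.IGNORECASE)
    if len(parts) < 2:
        return []  # no See-also section at all
    body, rest = parts
    # Keep only the See-also section itself, not any later sections.
    see_also = re.split(r"\n==[^=]", rest, maxsplit=1)[0]
    body_links = {m.strip() for m in LINK.findall(body)}
    return [m.strip() for m in LINK.findall(see_also)
            if m.strip() in body_links]
```

A human-in-the-loop bot could then present each flagged link to the operator for approval, which sidesteps most of the "linking isn't the same as covering the subject" objection above.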

Commons fair use upload bot

I have blocked the seemingly unapproved Commons fair use upload bot (talk · contribs), and opened a discussion at the incidents board. Input from those familiar with bots/bot policy would be appreciated. J Milburn (talk) 15:31, 24 August 2012 (UTC)

Assessment

Is there something wrong with the bot used for giving updates with assessments in WikiProjects? WP:LT/A has not been updated for over a month. Also, manually accessing the bot through toolserver is apparently forbidden, according to this. Simply south...... flapping wings into buildings for just 6 years 15:40, 26 August 2012 (UTC)

Yes, the bot was down for a few weeks. It started working again today. See here for latest status. Ganeshk (talk) 16:20, 26 August 2012 (UTC)

Did someone seriously approve a bot to spam people?

The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


I just received a newsletter that I never asked for The Olive Branch: A Dispute Resolution Newsletter (Issue #1), and whoever is behind it is forcing editors to add their names to Wikipedia:Dispute Resolution Improvement Project/NewsletterOptOut to stop being spammed. I am not involved in this process, I do not want to be involved, and I should not have to be put on some list of shame for something that some other editor thought was important. I can't believe that the process actually approved a newsletter that was opt-out instead of opt-in, that's crazy. ▫ JohnnyMrNinja 19:22, 4 September 2012 (UTC)

EdwardsBot is a bot that is designed to send out newsletters based on a request from an approved user. Based on some digging, the list for the DRN newsletter is located at Wikipedia:Dispute Resolution Improvement Project/NewsletterList. You probably want to talk to Ocaasi who sent out the newsletter. LegoKontribsTalkM 19:32, 4 September 2012 (UTC)

Seriously. Participating in a WP board is not an opt-in to be spammed. Not okay, and I'm surprised that there isn't anything in guidelines forbidding users from signing up others to receive spam without their permission. –Roscelese (talkcontribs) 19:44, 4 September 2012 (UTC)

Irony is a "dispute resolution" newsletter causing strife and disruption, but here we are. As with others, I agree that newsletters should always be opt-in only. This is, however, not a fault of EdwardsBot. It is a poor process decision on the part of the newsletter's organizers. Resolute 19:46, 4 September 2012 (UTC)
I, too, am unpleasantly surprised at being spammed on the dubious grounds that I am apparently the 441st-most-frequent-poster at WP:ANI. I've left a note to User:Ocaasi communicating that, I hope reasonably politely. -FisherQueen (talk · contribs) 19:49, 4 September 2012 (UTC)
Yeah, let's have them blocked for mass disruption of the project. TMCk (talk) 19:50, 4 September 2012 (UTC)
First let me apologize to anyone who is upset. A few precautions were taken to minimize the disruption here:
  • The full newsletter was not sent, only a link to the newsletter
  • Only editors who were highly and recently active in dispute resolution received the link
  • An opt-out list was provided immediately
  • The issue was raised at the Administrators' Noticeboard for Incidents prior to the mailing: link to discussion
I hope that mitigates some of the frustration here. I'll wait for more people to comment. Ocaasi t | c 19:52, 4 September 2012 (UTC)
"Only editors who were highly and recently active in dispute resolution received the link."? That's complete bull. TMCk (talk) 19:54, 4 September 2012 (UTC)
Calm down. He spammed an extract from a new newsletter to people who didn't ask for it. He didn't moon Jimbo or delete the main page. AGK [•] 19:56, 4 September 2012 (UTC)
(ec) He (or they?) forced me to opt out of being spammed in the future. That's not ok. TMCk (talk) 20:06, 4 September 2012 (UTC)
Seconded. I understand that contributors may need a venue to express frustration regarding how this bot has been utilized right now, but please keep it civil or the discussion will be closed. The bot's approval is not the issue here. — madman 20:01, 4 September 2012 (UTC)
Well, depends on how you want to define "highly and recently active". 50 ANI posts since January 1? That bar might be a little low. I'm actually surprised that is how I got on the list (and I'm 208th, beat you FisherQueen!) rather than being involved in two arbitration cases this year. Overall, I think this was well meaning, but I do think that the onus should be reversed. The newsletter went out, everyone knows about it. Those who want it can now opt in. I certainly agree with ideas like promoting it through the Signpost - Try to get some space within the Signpost's arbitration section for brief overviews of what this newsletter intends to convey. Resolute 19:59, 4 September 2012 (UTC)
(edit conflict × 2) This is really not acceptable. AGK [•] 19:55, 4 September 2012 (UTC)
AGK, can you be more specific about what you dislike about the list or otherwise? I recognize that people hate feeling spammed, but I thought that list was actually a constructive effort to target editors more narrowly. Ocaasi t | c 20:04, 4 September 2012 (UTC)
Narrowly? Based on number of edits to ANI?·ʍaunus·snunɐw· 20:11, 4 September 2012 (UTC)
Not only that, but based on a rather low number of edits over a 20 month period. A person needed only to average 2.5 edits per month to ANI to end up on this list. That is not exactly narrow. And then there is the rather amusing number of retired and banned editors on the list. Resolute 20:14, 4 September 2012 (UTC)
Hopefully it's obvious that this was a mistake. Please delete both user lists and in future do not create opt-out lists. ▫ JohnnyMrNinja 20:06, 4 September 2012 (UTC)
  • Opt-in not opt-out. We have to actually debate this internal spam? With the excuse of "Lack of awareness - if no-one knows about it, they won't know to opt-in" [1]? You could use that excuse for anything you want to advertise on the project. The linked supposed approval discussion had four participants discussing it for one day. That's it. And this was justification for spamming 985 recipients? Umm, no. Shut down the opt-out list, make the whole thing opt-in for the next newsletter (if there is one). --Hammersoft (talk) 20:11, 4 September 2012 (UTC)
    Of course now there is no more lack of awareness; people already know about it, because they were spammed. So just add an opt-in to the newsletter itself, and everything is resolved. I was very irritated by the spam, but I don't think it was anything but good-intentioned. Again, I hope the mistake itself is not in question, just the outcome. ▫ JohnnyMrNinja 20:15, 4 September 2012 (UTC)
    Yes, you're right. I have no problem doing opt-in from here on out. I am only wondering if once the majority of people have expressed their frustration and added themselves to the opt-out list, if the remainder would not mind receiving it. But, needless to say, we won't send out another edition again until we know we have this resolved to everyone's satisfaction. Ocaasi t | c 20:28, 4 September 2012 (UTC)
    Did you know that, according to this survey, half of violent crimes in the US are not reported? You can't assume that somebody wants something because they never told you they didn't after you forced it on them. ▫ JohnnyMrNinja 20:42, 4 September 2012 (UTC)
    Fair point. Assuming that only those who were vocal about their opposition were opposed to it might miss the mark. Ocaasi t | c 21:23, 4 September 2012 (UTC)
  • First, I would change the first paragraph of Wikipedia:Dispute Resolution Improvement Project/Newsletter to reflect the future of the newsletter being opt-in only, with a link to a page where people can opt-in, perhaps Wikipedia:Dispute Resolution Improvement Project/NewsletterOptIn. Second, before you release another newsletter, consider having other eyes look at it before release. There are a number of copy editing errors in this copy that would have easily been picked up by another set of eyes. Third, before you attempt any other sort of new ideas from the Wikipedia:Dispute Resolution Improvement Project that will impact more than just those of you interested in the project, start an RfC or use some mechanism to gauge community consensus for the action before undertaking that action. This was a half baked process, and you have no doubt caused a significant amount of animosity to be brought to bear against your efforts. A sampling of the edit summaries at the OptOut history should be enough to motivate you to not get this wrong next time. --Hammersoft (talk) 20:36, 4 September 2012 (UTC)
  • Also: adding names to the opt out list misses the point entirely. Shut down the list, make the newsletter opt-in only from this point forward. While this edit was less than civil, the intent is spot on, and I've removed myself from the list too. I am confident you will not make the same mistake of sending out spam again, so this list is unnecessary. --Hammersoft (talk) 20:39, 4 September 2012 (UTC)
Hm... I was told that saying "thank you" was very civil. I'll look into this. ▫ JohnnyMrNinja 20:46, 4 September 2012 (UTC)
Folks, please. This really isn't that big of a deal. They sent a newsletter. If you don't want it, just ignore it or delete it. It's no different than a user leaving one of those useless love kittens from the Heart tab. It really doesn't deserve this much attention or agitation. Kumioko (talk) 20:38, 4 September 2012 (UTC)
  • I'm not aware of anyone sending out 985 useless love kittens from the heart tab :) --Hammersoft (talk) 20:39, 4 September 2012 (UTC)
  • The problem with mildly irritating hundreds of people is that it distills into a large amount of total irritation. Also this is a really bad precedent. The fix seems simple, as I've mentioned above, but it seems that not everyone is on-board. ▫ JohnnyMrNinja 20:50, 4 September 2012 (UTC)
Ocaasi does not seem to be getting the message. If it happens again, it will be a big deal. Delicious carbuncle (talk) 20:46, 4 September 2012 (UTC)
Hi Delicious carbuncle. I've taken people's criticism very seriously here. We will not do things the same way again. What am I missing? Ocaasi t | c 21:06, 4 September 2012 (UTC)
Ocaasi, I'm sure your intentions are good, but statements like "I am only wondering if once the majority of people have expressed their frustration and added themselves to the opt-out list, if the remainder would not mind receiving it" make me think you have not understood what people were saying about doing things in this way. Delicious carbuncle (talk) 14:11, 5 September 2012 (UTC)
That was one quick response he made while this was still going on, if you read above he later says "Assuming that only those who were vocal about their opposition were opposed to it might miss the mark"; I do not think there are any more lessons to be learned here. ▫ JohnnyMrNinja 21:04, 5 September 2012 (UTC)
Thank you Ocaasi for removing me from the list. Add me to the list of people who think this list should not exist. Opt in. Opt in. Opt in. Don't spam people with shit they didn't ask for. --OnoremDil 20:47, 4 September 2012 (UTC)

Seriously, Ocaasi, if you write an apology at the top of the newsletter and add an opt-in list there may still be time to save it. At this point most of the people haven't logged in yet. This is going to create a negative impression of the newsletter that will take a while to wear off, if it survives. People resent things forced on them, even if they would normally like it. Accept that there will not be another opt-out mailing and move on. ▫ JohnnyMrNinja 21:02, 4 September 2012 (UTC)

Good idea. I've added a note to the top of the newsletter. Thanks, Ocaasi t | c 21:06, 4 September 2012 (UTC)
Thank you Ocaasi for acknowledging the mistake, that was the only part that was really frustrating me. It seems fitting that the inaugural issue of a dispute resolution newsletter require dispute resolution. ▫ JohnnyMrNinja 21:23, 4 September 2012 (UTC)
Okay, now that Ocaasi has agreed to this, it should be resolved. What we had here was an editor who was being bold and appears to have made a poor choice. Now, they appear to have recognized their error, so let's assume good faith and try to find a way to recover from the disaster in which some of us had an unsolicited newsletter linked to on our user talk pages. AutomaticStrikeout 21:11, 4 September 2012 (UTC)
  • Opt-OUT please. TY. — Ched :  ?  21:25, 4 September 2012 (UTC)
  • Probably not necessary to pile on here, but yea, make it opt-in, don't force me to opt out of something I never asked for in the first place. Beeblebrox (talk) 21:40, 4 September 2012 (UTC)
  • Yep. Let's all say it together now, folks: "A... G... F." :-D — Ched :  ?  21:56, 4 September 2012 (UTC)
  • To end on a lighter note: sending the announcement to ANI regulars meant that SineBot got a notification. JohnCD (talk) 22:18, 4 September 2012 (UTC)
  • When the bots start talking among themselves, I think we human editors could be excused for feeling nervous. How long before they decide they can do a better job without us? JohnCD (talk) 22:39, 4 September 2012 (UTC)
  • Having fought the bots in a previous conflict, I'm pretty certain that we can survive... for now. --Chris 01:18, 5 September 2012 (UTC)

As I just posted at EdwardsBot, I'd really like to see an RFC or some other similar process implemented to gauge consensus about when it's appropriate (or inappropriate) to send messages like this. And when messages should be opt-in, opt-out, or otherwise. And how often it's appropriate to send out messages.

These are important questions and I'm perfectly happy to see the bot blocked or its access list wiped clean if these issues can't be resolved reasonably. I think there is some utility in having such a message delivery mechanism, but it shouldn't be causing so many editors to be annoyed. --MZMcBride (talk) 01:31, 5 September 2012 (UTC)

P.S. And, of course, the name of the bot's configuration page ("User:EdwardsBot/Spam") is completely tongue-in-cheek.

I'd like to say that I started this thread thinking that BRQ had approved the process. Reading back now it clearly reads as me being uncivil (read "douchey") towards a single editor who did not deserve it (for which I've since apologized through email). That being said, I think it would be clear that these sort of scraped lists should not be used with opt-out mailings. If there is a list that editors added themselves to, like a WikiProject membership list, then maybe that mailing list could be opt-out. Or maybe a scraped list could be used for a one-time mailing, with follow-ups being strictly opt-in (as is now the case for this newsletter). But a scraped opt-out mailing list as was used today will always cause issues, so I think that should not be done again. Blocking the bot or disallowing the editors is not needed, as there was no abuse, just a mistake that has been fixed. ▫ JohnnyMrNinja 02:40, 5 September 2012 (UTC)
The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

Ignored question, slam approval

I posted a legitimate question about a bot.[3] My question was ignored. This bot does not need to create stubs any faster. It creates 100 stubs at a time, and each stub is supposed to be verified by a member of the project; unless the project suddenly gained a dozen new really fast snail editors, there is no reason for this bot to increase its stub-creation output.

Or if there was a reason, it was sure not readily available in answer to a question by a community member.

What is going on? Why did this bot have to be approved to create stubs at 5X its current rate? How is Wikiproject:Gastropod handling the approval of these bot-created stubs? The community has spoken a number of times about bot-created stubs, and not usually favorably. This bot operator has "misinterpreted" prior approvals to mean something entirely different from what was intended. This is not a bot and operator to be speedily approved when a community member has asked a question.

I would like the discussion re-opened, and the question answered. I don't care how old the operator is.

68.107.140.60 (talk) 03:14, 9 September 2012 (UTC)

I'm sorry you feel your question was ignored; I couldn't see that it was, as it was answered by the bot's operator. It seemed fairly obvious to me that the increase in the bot's output was to avoid having the bot be the bottleneck in the approval process for the articles. If you review the link to WikiProject Gastropod's talk page in the request, you can see that the contributors to that project have been able to easily handle the bot's current output; a good amount of time is spent waiting on the bot to create more articles. If you look at the current talk page, you can see some support and no opposition for the increase in output; this support was expressed in the request as well. I can't see any way in which this simple request for a change in parameters could be misinterpreted to imply approval of additional tasks, and I couldn't see any reason not to approve such a request. I understand your concern regarding community input and I acknowledge that I could have waited longer, but the request had been open for ten days. Ultimately I judged the pre-existing and current consensus to be well in favor of approval and I closed the request as such. Thanks, — madman 04:13, 9 September 2012 (UTC)
I couldn't see any way that task 3 or 4 could be interpreted as approval for 10,000 articles, yet it was. It makes no sense that a bot with this task is a bottleneck compared to the human editors checking the articles. Something fishy is going on here, or there is something wrong with the programming, and the bot requires more scrutiny.
It had been open for ten days, so you had to quickly speedy approve when I asked a question? Why? I do not approve of this task at all. If you have any programming experience, think about what you are telling me when you say the bot is the bottleneck. 68.107.140.60 (talk) 04:42, 9 September 2012 (UTC)
I have eleven years of programming experience and I can easily see how larger batches are going to be more efficient than smaller batches. It took ten months for the bot to generate about 2k articles, stopping after each batch of 100. I will say, however, that your point regarding the timing of the approval is well-taken. However long the request had been open and regardless of how your question was answered, I should have waited at least another day to see if you had any follow-up questions before closing and I apologize for having failed to do so. — madman 05:09, 9 September 2012 (UTC)
While certainly unusual to have a BAG member apologize, an apology generally has to come with making amends, and you have done nothing to fix the situation. It should not be taking a bot this long to create the articles, and if it does, that bottleneck is not broken by allowing the bot to create more articles, as, if it is the bot's slowness at issue, the bot is not functioning correctly. Your apology also does not address the issue of this particular bot and bot operator being the incorrect ones to get speedy approval. If it is your job to make the judgment call, then you made it wrongly, and I ask you to rectify it. I would like to know exactly what is going on with this bot that it took the bot forever to create the stubs, and you and the bot owner seem to think that letting the bot create more is the solution. 68.107.140.60 (talk) 05:31, 9 September 2012 (UTC)
(edit conflict) You haven't really suggested any way in which the situation could be rectified to your satisfaction, except to re-open the BRFA, and I'm not sure that's the best solution. I think you're hypothesizing "something fishy" is going on when it's not, and a good deal of your objections seem to focus on the operator rather than on the task. I'm willing to re-open the request and move this discussion there, but absent a concrete reason to do so, I'd like to wait for the input of a BAG member who hasn't been involved to determine whether I did make the wrong judgment call and if so, what the right call is now. There's no deadline and I'm sure this situation will benefit from my fresh perspective in the morning. Cheers, — madman 06:15, 9 September 2012 (UTC)
By the way, did you verify that GaneshK was in compliance with the last task? That he wasn't making 141 bot stubs, instead of the allowed 100, in August 2012, say? Maybe we can explain that also, and define for the community what number 500 actually means? Or what it means when the bot is approved for 100 at a time, but creates 141 instead? Does 100 mean 100 +/-50, so 500 means 500 +/-50 or 500 +/-250? -68.107.140.60 (talk) 06:13, 9 September 2012 (UTC)

As usual, a Wikipedian quotes an essay he has not read and that does not apply to the situation. Thanks. Much appreciated. My focus is on the task, as done by this bot operator who does not seem to think the rules apply to him. So, fresh start: let's get an explanation of how I can tell whether the bot owner is in compliance, when it appears that he thinks 100 means some other number, 141; or, I may be wrong, but I was not allowed to discuss the situation because you heedlessly speedied the closure of the BRFA. Let's get an exact explanation of what the bottleneck is, also, because if it is the slow speed of the bot, something is not right, although with your 11 years of experience, I wonder that you think one can cure a badly written slow program by giving it more to do. Maybe it has something to do with the count variable, and that would explain the 141/100 computer approximation.... 68.107.140.60 (talk) 07:02, 9 September 2012 (UTC)

The sky is falling! 41 edits over the limit! Wikipedia is doomed! More seriously, your comments have crossed over the line and have ceased to be constructive (see WP:PERSONAL). Focus on the edits, not the editors. Boghog (talk) 08:02, 9 September 2012 (UTC)
Oh, so the rules aren't rules? And, actually, part of running bots is the community's confidence in the editor; there's a neat little thing that discusses this, about how bots can make a bunch of edits, etc., and see also how bots are rejected when there is no such confidence in the operator. Nice try with the bots can't count. Now, back to the discussion. 68.107.140.60 (talk) 08:29, 9 September 2012 (UTC)

  Note: There is a lot of history being brought up here. On one side, Ganeshbot was approved in Ganeshbot 4 to create about 580 articles for the genus Conus, limited to 100 per month. In Wikipedia talk:Bot Approvals Group/Archive 7#Wrong way of the close a BRFA, the 100-per-month limit was lifted. Somehow or other, the members of WikiProject Gastropods thought they were allowed to have the bot create 15000+ articles for other gastropods without further approval or any rate limiting. This, understandably, caused much consternation. Ganeshbot 5, asking permission to finish creating these tens of thousands of articles, was eventually closed as no consensus; Ganeshbot 10 was eventually approved with the rate of creation limited to the rate of review by the project.

On the other side, the IP user 68.107.140.60 seems to be the same user who has been around for the anybot mess (see pretty much all of Wikipedia talk:Bots/Requests for approval/Archive 4) and other related discussions. If it is the same user, he/she serves a valuable function in watching bot activity and approvals generally related to "species" articles with a critical eye, but this is counterbalanced by the user being extremely sensitive to perceived slights against IP editors and being generally quick (to the point of disruption) to throw around accusations of being ignored or suppressed, of being on the receiving end of incivility, and of editors in "power" being biased against their viewpoint. Anomie 08:38, 9 September 2012 (UTC)

Yeah, I'm in one of my bad mood guises; I was asked to check this out, and I thought it looked okay, then all of a sudden the speedy closure! And this on top of a recent range block and name calling by an inflated admin, who eventually had to apologize and restore my edits which had removed the bad science he was married to; what it is about climate change that makes both sides promote bad science? So, yeah, I get pissed off when my students turn in crap science plagiarized from Wikipedia. So, yeah, I'm back, and this did not seem like much, this bot running increased numbers, but I was willing to discuss it, check it out, see what was going on, until I got the run around, which, yeah, I tend to be disruptive about getting the run around. It bothers me. Hope all has been well with you, Anomie. -68.107.140.60 (talk) 17:42, 9 September 2012 (UTC)
Welcome back, we need more people who watch out for bad science. Just remember to assume good faith when dealing with people who don't have a track record of doing things wrong. Anomie 18:08, 9 September 2012 (UTC)
It would be easier not to, if we had kept this conversation at the RFBA. I really think BAG does poorly and continues to do poorly in responding to community input, and this is the case for many bot operators also. And some problems arise out of this. I've never really left. -68.107.140.60 (talk) 18:16, 9 September 2012 (UTC)
IP, I will try to respond to some of your questions here. The bot has been creating good stubs for over 10 months now. The request for an increased limit was for getting this task done faster. I will still wait for a project member to thoroughly review the stubs before I create the next batch of 500. The bot process has two steps: (1) data extraction into flat files, and (2) an AWB run to create the articles. I monitor every AWB run very closely. I created 141 stubs instead of 100 in the last run because that was the last batch of the family and I did not want to schedule another run for just 41 stubs. I am now creating stubs under the Turbinidae family; the limit is back to 100. Ganeshk (talk) 12:29, 9 September 2012 (UTC)
Please act like I am stupid, and explain this to me. It seems you create 100 and wait a few days while they are checked. Now you want to create 500, then, what, wait a few more days? Why? Your answers do not add up to an understanding of why you need to create more at a time.
I think also, that for a bot operator who has, in the past, had a problem with what a task was approved for that caused serious community uproar and time consumption, you should not be thinking that the task approved means anything other than the task approved. If the task is approved for 100, it is my opinion that that is a hard number, in your case. -68.107.140.60 (talk) 17:40, 9 September 2012 (UTC)
The increased limit will help me cover more ground in each bot run. I have little time to spend here outside of RL and would like to use that efficiently. I agree that the 41 stubs were over the agreed limit. I think BAG will agree that there is some flexibility for small changes like that. My talk page is always open for any suggestions you may have. Ganeshk (talk) 18:37, 9 September 2012 (UTC)
Does this mean you want to create more at a time as a convenience to you? (Nothing wrong with that.) This is different from dealing with a bottleneck that does not exist; it eliminates my questions about the bottleneck, which was poorly explained (if at all) by you and the BAG approval. I think transparent communication and community involvement are important when creating articles of this nature. I would request that you consider 500 a hard number, in spite of the sarcastic comments about no rules. If the sole purpose is a matter of convenience for you, and the same procedures are followed by Gastropod members, I see no issues with this bot. But, if there is something nefarious going on, this imaginary bottleneck or bad code that made no sense, then, maybe I will be checking into things more closely. I would like this conversation added to the task approval, where it belonged in the first place. 68.107.140.60 (talk) 05:13, 11 September 2012 (UTC)
Thanks. I have linked this discussion here, Wikipedia_talk:Bots/Requests_for_approval/Ganeshbot_11. Ganeshk (talk) 11:55, 11 September 2012 (UTC)
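For readers unfamiliar with this kind of task, the two-step workflow Ganeshk describes (flat-file extraction, then batch article creation capped at the approved limit) can be sketched roughly as follows. The field names, stub template, and CSV format here are illustrative assumptions, not Ganeshbot's actual implementation.

```python
# Hedged sketch of a two-step stub-creation pipeline:
# step 1 extracts species records to a flat file,
# step 2 builds (title, wikitext) pairs for one batch,
# hard-capped at the approved per-run limit.
import csv
import io

# Hypothetical stub template; real taxoboxes have many more fields.
STUB = ("{{{{Taxobox | name = {name} | genus = {genus} | species = {species} }}}}\n"
        "'''''{name}''''' is a species of sea snail in the family {family}.\n")

def extract(rows):
    """Step 1: write records to a flat CSV file (an in-memory buffer here)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "genus", "species", "family"])
    writer.writeheader()
    writer.writerows(rows)
    buf.seek(0)
    return buf

def make_batch(flat_file, limit=100):
    """Step 2: build at most `limit` (title, wikitext) pairs from the flat file."""
    reader = csv.DictReader(flat_file)
    return [(row["name"], STUB.format(**row)) for _, row in zip(range(limit), reader)]
```

The point of the hard cap in `make_batch` is that the batch size is enforced in code rather than by operator discretion, which would avoid the 141-versus-100 discrepancy discussed above.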

This is just about an understanding of this bot operator's request. Let's not make it about me or anything else. -68.107.140.60 (talk) 17:42, 9 September 2012 (UTC)

I have read the essay and I like the spirit of it. But that's not the point. The point is that when a bot operator has been operating within the bounds of his approval for almost a year, I think it's rude to suggest that he "doesn't think rules apply to him". It seems you're not afraid of being rude, judging by your edit summaries I didn't see last night ("time for irc secret cabal action" indeed). I do think you raise some good points about how the task may currently be executed and the timing of the closure. I'm interested in the implementation of the task too. But your incivility disrupts the discussion, it does make it all about you, and it erases any goodwill others may have toward you and any hope the discussion will be resolved in your favor. If you're assuming bad faith on the part of the operator because of an incident that occurred in 2010, I think it's time to let it go now. Contributors (even bot operators!) make mistakes. If you're assuming bad faith on the part of the operator because he made extra edits in order to finish his last run, I think that's ridiculous and would remind you that the fifth of Wikipedia's five pillars is "Wikipedia does not have firm rules." — madman 18:10, 9 September 2012 (UTC)
In conclusion, we don't need BAG's bag of rules. Ganesh slides around rules, and I have an issue with that; but he also listens, and closing his BRFA early shut the door on the best room for discussing and clearing up the rules before a mess is made. The bot rules. -68.107.140.60 (talk) 18:16, 9 September 2012 (UTC)

Bot Code

I don't have enough coding experience to make a bot, but I was wondering if someone could make the code for a bot that automatically posts a message on a person's talk page after they make one edit. This is for a different wiki where I have been asked to make a bot that does that. ad Intellige ad nuntius 02:26, 16 September 2012 (UTC)

Pywikipediabot has welcome.py which can be used for that task. LegoKontribsTalkM 02:43, 16 September 2012 (UTC)
Thank you, this just made my life about ten times easier :) ad Intellige ad nuntius 21:30, 16 September 2012 (UTC)
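For anyone curious how such a welcome script works, the core decision logic can be sketched in a few lines. This is a simplified illustration, not welcome.py's actual code; the qualifying condition and the message text below are assumptions for the example.

```python
# Simplified sketch of a welcome bot's per-user logic: decide whether the
# user qualifies for a welcome, then build the talk-page wikitext.
# The template call is a hypothetical example, not welcome.py's default.

WELCOME_TEXT = "== Welcome! ==\n{{subst:welcome}} ~~~~\n"  # assumed template

def needs_welcome(edit_count, has_talk_page):
    """Welcome a user after their first edit, unless their talk page
    already exists (a common proxy for 'already welcomed')."""
    return edit_count >= 1 and not has_talk_page

def build_welcome(username):
    """Return the wikitext to post to User talk:<username>."""
    return f"Welcome to the wiki, [[User:{username}|{username}]]!\n" + WELCOME_TEXT
```

In practice, welcome.py handles the wiki-side details (scanning recent changes, saving the talk page) on top of logic like this.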

Dissertation including Bot Operator Interviews now available

Hello everyone-

I wanted to let you know that my dissertation, "Network of Knowledge: Wikipedia as a Sociotechnical System of Intelligence" is now available on my website with a CC BY-NC-SA 3.0 license. Over a year ago I began this project with the WMF Research Committee and the University of Oregon IRB's approval. Nearly 50 bot operators, WP contributors, and WMF administrators were kind enough to participate in the study, offering their time, opinions, and expertise on issues around bots and bot creation. Feel free to download the document or peruse it online, and I look forward to your comments either on the site or via email.

The manuscript is a bit long (~320 pages) and includes some standard dissertation sections (literature review, methods chapter, etc.). Interviewee contributions are featured most in Chapters 5 and 6 (if you want to skip to the good stuff).

I am at a new institution now and will be going through a new IRB approval process to continue this research, but I do indeed want to continue chatting with the bot and semi-automated tool community. Please let me know if you're interested in connecting this fall, and thank you so much to those who have already participated!

Randall Livingstone UOJComm (talk) 23:55, 20 September 2012 (UTC)

Why no bot-created tag on articles

Why is there no "this article created by a bot tag" on the article? And, no, I don't give a dang about the edit history. Was this a decision made, or has it never been discussed? 68.107.140.60 (talk) 01:56, 16 September 2012 (UTC)

As far as I can see, it's simply never been discussed. — madman 04:08, 16 September 2012 (UTC)
Then let's discuss this at a community board or add the tag. 68.107.140.60 (talk) 03:29, 17 September 2012 (UTC)
This is a community board. And you can request that the operator add such a tag, but it's not going to be considered a retroactive condition of operating a task that's been approved for almost a year. — madman 04:29, 17 September 2012 (UTC)
I certainly would support making it a retroactive condition (i.e. making the bot operators go back and tag the articles). Mind you, I'm very strongly opposed to bot created articles, because I've yet to see them not be heavily neglected utter shit, so at the very least having a tag somewhere would mean that people could find easy targets for articles desperately in need of improvement. Sven Manguard Wha? 18:32, 17 September 2012 (UTC)
I tend to agree with Sven's sentiment here.·ʍaunus·snunɐw· 18:34, 17 September 2012 (UTC)
I do not think we need to add another useless comment banner to articles that already have too many. I would support either a category noting the article was bot-created, or a banner on the talk page, maybe something akin to Template:WikiProject Articles for Creation. Kumioko (talk) 18:51, 17 September 2012 (UTC)
I think it makes more sense to have a tag on the article itself which can be removed once the article has been patrolled by a human editor.·ʍaunus·snunɐw· 18:57, 17 September 2012 (UTC)
I do maintain a list of the articles on the Wikipedia namespace for project members to review and update (see Turbinidae). Retroactive updates will be difficult since some articles may be edited by humans. I am okay with adding a template to new articles going forward if the community wants it. Ganeshk (talk) 21:13, 17 September 2012 (UTC)
If the WikiProject members are okay with removing such a tag from all articles they review (they can of course leave it if it's on the talk page), that sounds like the best plan. To those above who support making it a retroactive condition, my point was that there's no provision in the bot policy to do so unless you're suggesting the existing approval should be revoked. Thanks, — madman 22:37, 17 September 2012 (UTC)
As Kumioko mentioned, it could also be implemented via a hidden tracking category on the article itself like Category:Articles created via the Article Wizard. That ought to be even less noticeable to the casual reader/editor than the other options. VernoWhitney (talk) 02:39, 18 September 2012 (UTC)
I think that using a tracking category is probably the best method (if any). I don't see the need for a huge template at the top of every article alerting the reader that the article was created by a bot. LegoKontribsTalkM 05:24, 18 September 2012 (UTC)
I have no problem with the existing approval being revoked while this is settled. It does not seem necessary to me, but should be done if BAG deems it necessary. This is not a sufficient location: this may be a community board, but its title is "Bot owners' noticeboard," so let's not pretend it is community-wide in scope. 68.107.140.60 (talk) 01:12, 19 September 2012 (UTC)
I would like a permanent tag, not removed by human editor, could be category, or template, made by bot, what bot, what source. 68.107.140.60 (talk) 01:18, 19 September 2012 (UTC)
Just curious, 68.107.140.60, what value would be provided in being able to determine if an article was created by a bot or not? Thanks! GoingBatty (talk) 02:02, 18 September 2012 (UTC)
Easy for readers to see. So when someone not a community insider runs across 5000 pieces of crap that need deleted it is faster to see the source of the problem, and read the comments above by other editors. And, any reason why not? It's anyone, not anybot can edit. 68.107.140.60 (talk) 01:18, 19 September 2012 (UTC)
It is already easy for the community to see this by looking at who the first editor of the page was. Once a page is reviewed by a human there is absolutely no need to know that it was a bot that created it since it has been human checked. All it would do is give the false sense that the page is crap just because it was created by a bot. -DJSasso (talk) 18:31, 19 September 2012 (UTC)
No, in the case of bot-created content, the non-editor should have access to this information without actually seeking it. The tags on the article pages should be for readers. Editor/readers above raise concerns about bot-created articles, and scouring databases for information is different from the type of articles that Wikipedia aspires to. In addition, the copyright issues might be different. There are many other issues. It amounts to this: the article was not created by a human editor, but by a bot; that information should be transparently available. There is no reason for it not to be. 68.107.140.60 (talk) 14:06, 20 September 2012 (UTC)
Don't agree, I think there are reasons why you wouldn't. And the reasons you list for doing it don't seem all that compelling. Ruining the quality of the article by throwing up a notice on it saying it was bot generated even if it no longer resembles the article that the bot created would be a ridiculous action to take. As for copyright there would be no difference as a bot account is just another account of the bot operator so attribution would fall to them. -DJSasso (talk) 23:10, 20 September 2012 (UTC)
Why would the knowledge that the article had been created by a bot ruin the quality of an article? Maybe, if knowing a bot created it is sufficient to ruin the article's quality, we should not be using bots to make stubs. 68.107.140.60 (talk) 04:58, 21 September 2012 (UTC)
All banners ruin the quality of an article, not just those that would be created by a bot. If there is no sufficient reason to use one then we shouldn't. I don't see any good reason why we should. -DJSasso (talk) 16:25, 21 September 2012 (UTC)

Well, you've caught my curiosity. To my knowledge, I've never seen an article that was created by a bot, though it's easy to find bot edits to articles created by humans. Please post some links to some typical bot-created articles so I can see what the discussion is all about. And I mean actual articles, not pages like Wikipedia:Requested moves/Current discussions that the bot I operate writes. Thanks, Wbm1058 (talk) 18:28, 19 September 2012 (UTC)

answering my own question: Pages created by Ganeshbot—very impressive! So it appears that all the data is coming from creative-commons websites like World Register of Marine Species (marinespecies.org). Looks like these are better quality stubs than a lot of the stuff we get that's produced by humans   Wbm1058 (talk) 19:57, 19 September 2012 (UTC)
Need admin access to see the deleted articles. Yes, bot-created stubs, particularly taxa, are very useful, as the standard is that all valid taxa are notable. There are similar databases for other organisms, and if this is done properly by Ganeshbot, it may change community opinion about bot-generated taxon articles and lead to the same for other areas, plants, algae done properly, arthropods, etc. It would take Wikipedia editors 100+ years to create all these stubs that provide the only information on the web for some genera and species, including the currently accepted, by at least one authority, taxonomies. 68.107.140.60 (talk) 14:06, 20 September 2012 (UTC)
The problem is that while one bot might do a decent job, others do very poor ones. If I recall correctly, Dr. Blowfeld was responsible for a lot of deeply flawed articles on villages in Afghanistan, or something like that. There was no reviewing body; heck, there was no form of quality control, and the ensuing uncovering of problems and subsequent community yelling session did little to alleviate the problem. I'm personally in favor of a complete ban on articles created by bot or script. That being said, if someone wants to generate the articles in the userspace using a bot, then check each one and move them over once they've been cleared, that's a passable alternative. The argument that this creates a lot more work does nothing to sway me, as that 'lot more work' is the basic quality assurance that is expected of all human editors all the time. Since said ban isn't likely, at the very least a tag that says 'this article was created by a bot, it hasn't been checked by a human yet' needs to be put in place. Sven Manguard Wha? 05:20, 21 September 2012 (UTC)
I would still like bot-created articles to have such a tag on their pages. The information was scrubbed from a database by a bot, not a human, and I would like that tag to remain to identify it as such. I think that ultimately large numbers of bot-created articles will be on Wikipedia. Doing it in a transparent way makes it easier for the human editors to identify such articles and contribute to their quality. -68.107.140.60 (talk) 05:34, 21 September 2012 (UTC)
Yes, quality takes time; I don't buy the lame arguments, either. -68.107.140.60 (talk) 05:44, 21 September 2012 (UTC)
It makes sense to me that bot-created articles should have an additional level of scrutiny. Surely any bot operator capable of scrubbing a third-party database to create articles en masse could easily write them to the bot's user space, then post a notice to a noticeboard similar to WP:RM, requesting permission to use a bot to move from the bot's user space to article space, en masse. Then any interested editors could take up to seven days, more or less, to review the articles in bot user space, and vote them up or down. This could be a separate process from approval of the bot itself. I don't favor tagging the articles with a visible bot-created notice, but putting them in a hidden, administrative category:Bot-created articles would be fine. There could be sub-categories within that category for specific bot's articles – Wbm1058 (talk) 15:37, 21 September 2012 (UTC)
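For what it's worth, the hidden tracking category suggested above would need only two pieces of wikitext. The category name here follows Wbm1058's suggestion and is hypothetical until someone creates it:

```wikitext
<!-- Appended by the bot to each article it creates: -->
[[Category:Bot-created articles]]

<!-- On the page "Category:Bot-created articles" itself, this magic word
     hides the category from the reader-facing category list at the
     bottom of each article: -->
__HIDDENCAT__
```

Per-bot sub-categories (e.g. one per operator) could then simply be made members of the parent category.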

I have temporarily shut down my bot due to internal malfunctions. I was worried the malfunctions were beginning to manifest themselves on Wikipedia.—cyberpower ChatOffline(Now using HTML5) 12:39, 23 September 2012 (UTC)

Looking for a server?

Amidst all the Toolserver/Labs turmoil, I acquired a 2008 Mac mini, which I then installed Linux upon. It now runs fairly speedily 24/7 (assuming there isn't a power outage), and, as I have no real use for it, I am offering the use of it (just send me scripts) up for grabs to any disgruntled bot operator(s) who'd like me to run their programs on it. Just an FYI - please drop me a message if you're interested - I can send you system details/whatever else you'd like. Just trying to do my part! Theopolisme 07:40, 30 September 2012 (UTC)

ClueBot III down?

I noticed that some closed threads at NFCR haven't been archived yet. Looking at Special:Contributions/ClueBot_III shows that it has made no edits since 7 October. Will the bot be revived? -- Toshio Yamaguchi (tlkctb) 07:14, 13 October 2012 (UTC)

The bot seems to be running again. PleaseStand (talk) 14:43, 13 October 2012 (UTC)

Yobot edit summaries too inconsistent with actual edits

I have an issue with Yobot. In particular, the bot is making too many changes at once and often with edit summaries that are not specific enough or simply tangential to the actual edits made. My initial comment on the matter started with this conversation which continued here. I think User:Magioladitis has put little weight on my comments, has not tried very hard to understand my point, and has been slow to respond, so I bring the matter here. After our exchange, my main contention is that Yobot's edit summaries are not specific enough. This makes checking its changes difficult and annoying for watchers. In particular, Yobot can sort ref tags while still making other minor changes. This can change large blocks of text in a single article while the edit summary might only refer to some minor edit sprinkled in among those changes. Jason Quinn (talk) 16:43, 15 October 2012 (UTC)

I would have to agree with this. I have had numerous issues with this bot. And this is one of the biggest problems. The summary rarely matches the task very well. -DJSasso (talk) 16:44, 15 October 2012 (UTC)

Comment I would additionally add that it's good advice that bots should do one thing and do it well and that Yobot seems to tackle too many tasks at a time. Jason Quinn (talk) 16:52, 15 October 2012 (UTC)

So just to clarify that I am understanding this correctly: you are upset because Yobot is doing too many things at once in one edit, making it harder to pick out individual changes in the diff? Which is funny, because a few weeks ago I saw another editor complain that a bot edited an article and then edited it again later so that it could describe each difference, and that was a problem because it wasn't making all the edits in one pass! Either we do all the edits possible in one pass or we break them up, which I am sure Magio has explained at least once in the past, but which also annoys some editors because it causes multiple edits to be made to the same article. We cannot have both, so someone needs to decide: either we live with the bot minimizing the number of edits by packing as much as possible into each one, or we break them up into multiple edits with more descriptive summaries, which will probably greatly increase the chances and frequency of insignificant edits being done by said bot. Kumioko (talk) 17:02, 15 October 2012 (UTC)
Upon further investigation of the situation provided and based on the article and edit in question I submit the following:
  • Below line 147: removed a space before "Traditional"
  • Below line 162: removed a space after "." and before "<" at the end of the paragraph
  • Below that: moved the "." before the ref
  • Below line 232: replaced "Further Reading" with "Further reading"
  • As well as cleaning up references
So it seems that the bot did indeed do the changes that it mentions in the edit summary.
I also notice that, according to this link, CBM blindly reverted the entire edit, as he is prone to do, and then 4 more (IMO wasted) edits were required by you and by Magioladitis to clean up what the bot did and CBM blindly reverted. I'm sorry, but I see this as a pretty pointless and baseless issue, and I think that CBM wasted time and system resources yet again with his pointless and needless reversions. Kumioko (talk) 17:57, 15 October 2012 (UTC)
As my edit summary indicates, I did not blindly revert - I preserved the one thing that the bot claimed it was doing, which was to move a period around a reference; see this combined diff of the bot's edit and my edit [4]. But I was not the one to open this thread, nor did I comment in it until now. I generally agree that a bot should do one thing at a time, well, rather than numerous things at once, badly. — Carl (CBM · talk) 19:19, 15 October 2012 (UTC)
I don't think the bot is doing anything badly. What I think the bot is doing is maximizing the edit by performing multiple fixes to the article at a time, while providing the best description it can that is still reasonably short. Those edits are consensus-approved, by the way, so if you do not agree with them, you should discuss them instead of blindly reverting. Also, I need to find the link, but I am quite certain that it was you in at least one instance who complained that a bot, Smackbot if I recall, was annoying you because it made one edit after another to an article, after Rich had to change it because other editors complained it was doing too much at once. So again, we need to come to some understanding of what a bot is expected to do. Either it maximizes the edit while it's there, or it makes multiple repetitive changes to an article to make the edit history easier to read (stupid and a waste of resources in my opinion), which will also greatly increase the chances that minor edits will be done alone. We cannot do both at the same time. I also think that Magio has bent over backwards to cater to a lot of nonsense complaints about this or that. As it is, Yobot is one of the only general edit bots left, and if it goes, there are going to be a lot of articles in bad shape, because I know that YOU are not going to step up and do anything to fix them. With that said, though, you always seem to have a better way or a better idea, CBM; fine, put your brain and time where your mouth is and write some better code to clean up the articles, and we can use that bot in addition to, or instead of, Yobot. But since I know that isn't going to happen, let's try to be gentlemen about this and not blindly revert edits made by others without discussion because you don't agree with them and don't want to take the time to discuss them. Kumioko (talk) 19:51, 15 October 2012 (UTC)
Reply: @Kumioko and @Magioladitis. I would not use the wording you used. Picking out "individual" edits is not what I am suggesting. I would say that Yobot should only tackle sensible classes of things together at once, along with a common-sense edit summary that clearly and concisely explains them. In particular, I have now repeatedly stated, and it's been repeatedly ignored, that sorting of references ought not to be done together with minor changes. The difficulty this presents to editors in interpreting the diff is an obstacle to following the edit history. If nothing else, please forget every single issue except this one. The edit summary is clearly inadequate for this. In fact, WP:REFPUNC doesn't even say that references should be reordered. On top of that, the other stated guideline, WP:PAIC, is just a shortcut to WP:REFPUNC. Why is a redundant link being used in the edit summary? There's clearly a problem here. If a bot is going to make thousands upon thousands of edits, it must be run by detail-oriented maintainers. The attitude that the bot is working just fine here and there's no problem at all feels obtuse and obstinate. At the least, when Yobot reorders references the edit summary needs to include "refs reordered". Better would be if reordering references were a separate task of the bot, with an edit summary like "Sorting article references according to [THE ACTUAL POLICY THAT SAYS TO SORT ARTICLE REFERENCES (IF IT EXISTS)]". PS: if you post a link related to the anecdote you wrote, I'll read it, but for all I know the person's complaints are consistent with my point of view, which is that tasks should be condensed... except when they shouldn't be. Jason Quinn (talk) 19:51, 15 October 2012 (UTC)
Thank you, Jason. I too believe that some of that is fair, and I believe Magio will too. The problem with the reordering of references as a separate task, as well as some other things, is that CBM and several others have ensured that these cannot be done alone. So, although I agree that more work can be done on the edit summaries, and I am sure that Magio will try to reciprocate, separating the tasks simply isn't possible if they are to be done at all. Additionally, some of these tasks, such as reordering the references, are built into AWB and are not "Yobot"-specific, so even if Yobot didn't do them, the chances are high that someone else using AWB would, thereby making additional unnecessary edits to the article. Thank you for the thoughtful and understanding response, and please excuse my rant at CBM above. We annoy each other with our diametrically opposing views. :-) Kumioko (talk) 19:57, 15 October 2012 (UTC)

My current edit summary is of the following form: "[[WP:CHECKWIKI]] error #xx fix and [[WP:GENFIXES|general fixes]]", which is followed by a short text that explains which rule I apply for the specific CHECKWIKI fix. The remaining edits are explained in WP:GENFIXES. Last week I worked mainly in the direction of creating skip conditions for each CHECKWIKI error fix. Instead of loading all lists together, I now work on each list separately. I am open to ideas for making my edit summaries smarter without having to generate a very lengthy edit summary. -- Magioladitis (talk) 18:41, 15 October 2012 (UTC)

Reply: See above. Jason Quinn (talk) 19:51, 15 October 2012 (UTC)
Hi Jason! I agree with the spirit of Magioladitis' editing and edit summaries: fix one specific problem in each article, and as long as he's editing it, make other minor improvements. He's starting with a list of all articles with the specific problem, so his edit summary is very specific about that fix. However, he can't anticipate which of the dozens of general fixes are going to be done in each article. Therefore, he provides a link to WP:GENFIXES which includes a detailed list of those fixes and the corresponding MOS guidelines.
If we were to create a separate bot task for each of the dozens of general fixes, our watchlists would be overflowing with lots of little edits, and we'd be spending more time reviewing them than actually improving the encyclopedia.
I suggest a slight variation to the edit summary format: "[[WP:CHECKWIKI]] error #xx fix (short text that explains the rule used to fix the error) and [[WP:GENFIXES|general fixes]]" (i.e. putting general fixes at the end of the edit summary) so it's clearer that the text applies to the Checkwiki error and not to genfixes.
Happy editing! GoingBatty (talk) 02:46, 16 October 2012 (UTC)
@GoingBatty. For reasons I feel I have already explained sufficiently, I believe your stance on the edit summary is untenable. Where does the reference sorting fall, in your opinion? As the "specific problem" or as "other minor fixes"? If it's the "specific problem", the edit summary completely failed to acknowledge what was being done, and even the linked WP:REFPUNC fails to discuss reference sorting. If it was "other minor fixes", how is it that a "minor fix" constitutes the bulk of the changes, with diff text that dwarfs the main "specific problem"? Neither idea seems to be very good. Whenever a "misc" list of edits is going to be done that aren't worthy of a specific mention in the edit summary, it should be covered by a link to WP:GENFIXES. On that point, I think we can agree. Lastly, again I state that my position is not to require bots to fix only one thing (although I am also not conceding that this is a bad idea). It is perfectly fine to have a bot do many things at once so long as it makes sense and a reasonable edit summary can be provided. In particular, because of the difficulty of finding and checking "other 'minor' improvements" in conjunction with a change like ref sorting that produces huge blocks of diff text, it does not make sense to have a bot include it as part of its normal workflow. Jason Quinn (talk) 04:33, 16 October 2012 (UTC)
The reference sorting is one of the "other minor fixes" covered by the link to WP:GENFIXES - see the ReorderReferences section of the page. GoingBatty (talk) 16:56, 16 October 2012 (UTC)
I agree with what GoingBatty said: if you're going to make an edit, try to get the most out of it. After my last BRFA, I've tried doing this with more of my bot tasks, doing simple things that AWB bots do by default (expanding template redirects, dating templates, other gen fixes) in Python. If bots can do simple uncontroversial tasks while editing, it's not a big deal if their edit summary isn't perfect. Yobot's edit summaries link to a page explaining what the bot is doing in more detail, which is a good compromise. LegoKontribsTalkM 03:18, 16 October 2012 (UTC)
@Legoktm. I never said the edit summary has to be "perfect" so please don't use overly strong words that try to undermine my argument. An edit summary should however be good, and it should especially not be misleading. In this case, the ref sorting the bot was doing was something that was not explained by the links. It's not even clear to me that it is supported by the manual of style. So the bot wasn't linking to anything explaining what the bot was doing. As for doing multiple things at once, that's fine. I never said a bot has to do only a single task (although I'm not rejecting that idea either). What I am saying is that a bot shouldn't do "too much" at once. Ref sorting can change huge blocks of text and bury other edits. This makes checking the bot's work difficult. I don't like the idea that people are just supposed to trust bots. I especially don't like the idea of discouraging people from inspecting bots' performance by allowing or tolerating confusing edit summaries and difficult to follow diffs. Jason Quinn (talk) 04:33, 16 October 2012 (UTC)
I guess we could (try to) add something to the edit summary when a ref reorder occurs. The current trend was in the opposite direction, though, because adding to the edit summary when a specific function is called is costly. Currently we do this only with tags. I am afraid that this could end up adding everything to the edit summary and making it unreadable or causing overflow. -- Magioladitis (talk) 23:18, 16 October 2012 (UTC)
I'm also concerned that changing AWB to specify each general fix done in an edit will overflow the edit summary. What did you think of my edit summary suggestion above? I hope Jason will also look at Special:Contributions/Yobot and suggest an edit summary that would be general enough for a group of edits. Speaking of the bot's contributions, the recent edit summaries are not linking to WP:GENFIXES. Thanks! GoingBatty (talk) 23:56, 16 October 2012 (UTC)
Arguing for AWB to specify "each general fix done" is a much stronger position than I am taking. I am not against the idea of lumping some fixes together as "general fixes". And I think supplying a link to WP:GENFIXES would be an improvement in some cases, as I already mentioned above. I think we find some agreement here. (Actually I prefer WP:AWB/GENFIXES for clarity purposes.) What constitutes a general fix then becomes the issue, and if certain things ought to be classified as general fixes. I have made my case that ref sorting is a problem if swept into the bag of "general fixes". It is not "minor enough" and deserves to be mentioned explicitly in the edit summary. This does not prevent a link to WP:AWB/GENFIXES if there were other edits. I'll take a look at Special:Contributions/Yobot for more ideas as I get a chance. Jason Quinn (talk) 03:18, 17 October 2012 (UTC)
Could you please say why ref sorting is "major"? Personally, I think swapping the position of two refs so they are listed in numerical order is pretty minor. No information changed. Both refs stay in the same spot at the end of the punctuation. I think one person's major fix is another's minor fix, so I don't see any point in arguing over that. Of course all my wife's complaints are major and all of mine are minor. :)
I think what happened is the same thing a lot of other editors experience (and Quinn stated above). You look at the edit and go holy $*&@# mother *($*#). Why did all this change? Nobody knows all the rules, and new editors are coming along all the time, so this is a normal reaction. So, a better edit summary seems the only option. How about "Other fixes may have been done that are listed at WP:AWB/GENFIXES. If you have questions on why an edit was made, please contact <enter name>." Bgwhite (talk) 06:15, 17 October 2012 (UTC)
Remember that the edit summary is limited to 250 characters. Your suggestion is already 141 characters, which only leaves 109 to explain the primary purpose of the edit. While I don't object to adding a couple extra words to explain genfixes, it seems redundant to have to explain who to contact. Another option is to provide an explanation at the bot's user page, as I've done at User:BattyBot. GoingBatty (talk) 00:02, 18 October 2012 (UTC)
I could change my edit summary to fit your suggestion above. In User:Yobot I focus more on my edits on the talk pages. I also have a list for all my tasks with links. I could add some things on general fixes too. Any help is welcome. -- Magioladitis (talk) 11:45, 18 October 2012 (UTC)

User:Redrose64 running unapproved bot(s)?

Sorry if I'm about to defame a hardworking user, and an administrator at that, but I noticed something a bit strange on my watchlist today - two identical edits by User:Redrose64 on completely unrelated pages on a fairly esoteric technical point - here and here. Looking at the user's edit history reveals large numbers of edits to various pages identical to those I encountered. Looking further back, there are similar instances of these editing patterns on various different technical points, sometimes at a rate of several per minute and accounting for hundreds of identically described edits. I can make some fast edits sometimes, and I'm not averse to hard work, but nothing approaching this...

For all I know this may be normal AWB activity, I really am no expert. I'm sure this is a hard working administrator doing a good job - but it just seemed a little odd so I thought I'd flag this up to some people who are more knowledgeable than I am! MatthewHaywood (talk) 00:14, 17 October 2012 (UTC)

Hi Matthew! Looking back on the recent entries in Special:Contributions/Redrose64, I see different edit summaries, but sometimes in related groups, as if RedRose64 is cleaning out some maintenance categories. What makes you think RedRose64 is using AWB for this? Have you tried discussing these edits with RedRose64? Thanks! GoingBatty (talk) 00:28, 17 October 2012 (UTC)
It was just the surprising rate (several edits on different articles per minute sometimes), number and uniformity of the edits which seemed to be different to 'normal' editing patterns which I had experienced in the past. I certainly couldn't achieve this at the same speed. I didn't really want to confront the user before getting a second opinion here, and since you seem to think it's normal I might just let it drop. MatthewHaywood (talk) 00:37, 17 October 2012 (UTC)
Several times a minute? Examples please. I use no bots or scripts (I prefer to check my work before saving). It's very simple. I check Category:Pages containing cite templates with deprecated parameters two or three times a month and fixup the article pages in there. The most common issues that I fix are the use of |day=, |accessed= and |access-date=. --Redrose64 (talk) 10:43, 17 October 2012 (UTC)
My mistake. I didn't realise there was a category for pages with deprecated cite templates, this must be how you achieve the edit rate manually. I assumed only a bot could locate and rectify these issues at the speed you do. You're clearly just a highly efficient editor! Sorry to bother you. MatthewHaywood (talk) 13:05, 17 October 2012 (UTC)
Just for reference, Redrose64's edit speed and quantity are well below the rates of many other editors, who oftentimes don't even use AWB or scripts. —  HELLKNOWZ  ▎TALK 13:56, 17 October 2012 (UTC)
Also just for reference, there are many scripts and tools that allow users to check their work before saving. GoingBatty (talk) 01:28, 18 October 2012 (UTC)
I've done six more. These were all found manually; I'm notifying the Project because I don't want to end up like Rich Farmbrough. --Redrose64 (talk) 20:22, 22 October 2012 (UTC)
If this is such a common error, and all (or most) the errors are like the ones you are fixing, it should be pretty easy to send a bot around and fix them. (So you don't have to do them manually). Legoktm (talk) 20:25, 22 October 2012 (UTC)
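A bot pass like the one suggested here could be sketched as a simple wikitext substitution. The parameter pair below is illustrative of the deprecated/replacement names mentioned in this thread, not the category's full list, and real bot logic would need to confine the change to cite templates:

```python
import re

def fix_deprecated(wikitext):
    # Rename the deprecated |accessed= parameter to |accessdate=
    # (one example of the parameter fixes discussed above).
    return re.sub(r"\|\s*accessed\s*=", "|accessdate=", wikitext)

print(fix_deprecated("{{cite web |url=http://example.org |accessed=2012-10-01}}"))
# {{cite web |url=http://example.org |accessdate=2012-10-01}}
```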

WP 1.0 bot - new maintainer needed

I am going to step down as the maintainer of the WP 1.0 bot at the end of November. A new maintainer for the bot is needed. More information can be found here. — Carl (CBM · talk) 17:10, 28 October 2012 (UTC)

Rogue bot?

Edits by User:Hammocks and Honey that say db-a9|bot=ExampleBot and incorrectly put the A9 CSD on articles where the artist is bluelinked. I'm reverting from the earliest on the list, but I can't see how to stop it. I've messaged Hammocks and Honey. Peridon (talk) 17:41, 8 November 2012 (UTC)

Blocked by Nyttend, and mass-rollbacked by Reaper Eternal. Legoktm (talk) 17:51, 8 November 2012 (UTC)

RMCD bot

The RM bot, User:RMCD bot aka User:RM bot has stopped running. Can it be restarted or is yet another fork needed? Apteva (talk) 16:12, 25 November 2012 (UTC)

  BRFA filed I've forked it until the real bot starts working again, per the request at Wikipedia:Bot requests/Archive 50#RMCD bot. — Wolfgang42 (talk) 16:56, 25 November 2012 (UTC)

BRFA for duplication of existing task on a different page

Hi all,

A few months ago SuggestBot got approval to update the Community Portal's list of open tasks (Wikipedia:Community portal/Opentask, BRFA Wikipedia:Bots/Requests for approval/SuggestBot 7). I'm now interested in having it update a smaller list of tasks (Template:Opentask-short) for experiments done in the Onboarding new Wikipedians project.

Would it be necessary to do a separate BRFA for this? Maybe it would instead be possible to refer to the previous BRFA and that the bot serves the same purpose? Would appreciate some input on this. Cheers, Nettrom (talk) 18:38, 20 November 2012 (UTC)

I think you should request another BRFA, unless you do the edits in your or your bot's userspace.  Hazard-SJ  ✈  02:45, 25 November 2012 (UTC)
That was my thinking as well, and I also think it's useful to have the paper trail just in case. Will get the BRFA submitted. Thanks for your input! Nettrom (talk) 18:28, 26 November 2012 (UTC)

Tip of the day

Automating tasks on Wikipedia

Uploading hundreds of files or changing thousands of pages can be tedious. We allow limited automation unless it interferes with normal systems operations. You always can grab your favorite scripting language and write a bot, but there's no need to reinvent the wheel: take a look at PyWikipediaBot, a quite complex automation framework for Wikipedia. If you are more into Perl, libwww-perl is a very useful library for automating web tasks. If you have tested your bot and intend to run it over a longer period of time, please get in touch with the developers first (preferably using the wikitech-l mailing list) or by requesting a flag here. We then can register your bot, so it can be hidden from the list of recent changes.

---cut here---

The above is being put out as the "tip of the day"; it looks like it was written in 2005. Somebotty might like to find and update the tip. Rich Farmbrough, 02:59, 16 December 2012 (UTC).

Oh god. I'll take a stab at it. Legoktm (talk) 03:00, 16 December 2012 (UTC)
I created Wikipedia:Tip of the day/December 16, 2012, however that can probably be majorly re-written too. Legoktm (talk) 03:08, 16 December 2012 (UTC)

Threads of interest on the mediawiki-api mailing list

There are some threads that may be of interest to bot operators on the mediawiki-api mailing list. In summary:

  • A proposal to change how query-continue works, most extremely from the current style to returning a single value that must be appended to the original query.
  • A proposal to add versioning to the API, with a discussion of pros and cons.

Feel free to join the mailing list and the discussions (you can sign up for gmail or another free email service if you don't want to reveal your personal address), or I'll try to summarize replies posted here at some point. Anomie 14:25, 22 December 2012 (UTC)

Query-continue proposal

Personally, while I think the current system could be cleaned up somewhat, I don't much care for removing all option from the client in how continuation is processed. Anomie 14:25, 22 December 2012 (UTC)

This proposal only seems to relate to situations where the API gives multiple continue parameters. Personally, I think the version suggested by Yuri is the way it should have always worked - a bot should view all the continuation parameters as opaque, rather than trying to interpret what they mean. The current system seems overly complex. (Also, for those who don't know, Yuri was the person who wrote query.php originally, and I'm glad to see him discussing api.php now.) — Carl (CBM · talk) 16:37, 22 December 2012 (UTC)
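The opaque-continuation style Carl describes amounts to merging whatever continuation block the server returns into the next request without inspecting it. A minimal sketch (the key names in the sample response are illustrative of a `continue`-style reply, not the exact API format under discussion):

```python
def next_request(params, response):
    """Build the follow-up query by copying the server's continuation
    parameters into the original request verbatim, treating them as opaque."""
    nxt = dict(params)
    nxt.update(response.get("continue", {}))
    return nxt

# Example: the server says to append gapcontinue=Foo to the next query.
resp = {"continue": {"gapcontinue": "Foo", "continue": "gapcontinue||"}}
print(next_request({"action": "query", "generator": "allpages"}, resp))
```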

API versioning

Personally, I'm particularly interested in pros and cons for introducing versioning at all, which the original proposal seems to have assumed as a given. Anomie 14:25, 22 December 2012 (UTC)

If the plan is to change interfaces greatly, I would much prefer a versioned system to one where my existing libraries will suddenly break and I have to reimplement them under a tight deadline. The latter has happened over and over to me as the API was changed (most dramatically, in the edit and login methods). — Carl (CBM · talk) 16:44, 22 December 2012 (UTC)

Hi; could somebody who understands these things please check which bots are actually fixing double redirects at the moment? The only one I have personally recently witnessed working is AvocatoBot (task list · contribs) – but, in any case, could someone do a more thorough/scientific check and update the relevant pages as appropriate? Thanks   It Is Me Here t / c 14:23, 14 December 2012 (UTC)

User:Xqbot is also running. It probably would be much easier if WP:Bots/Status was still up-to-date. Legoktm (talk) 14:25, 14 December 2012 (UTC)
Or someone could probably scan edit summaries for anything in the form of the default PyWikipedia summary for that task from Toolserver (limiting the time, if possible).  Hazard-SJ  ✈  03:24, 16 December 2012 (UTC)
Regarding WP:Bots/Status, I was keeping on top of it but have slipped into forgetting. I will have a large purge and clean-out of it later today. Remove the inactives, add newly approved. Rcsprinter (articulate) No, I'm Santa Claus! @ 10:45, 23 December 2012 (UTC)

Scsbot not dating help pages

When I discovered the problem, I inserted dates myself. Someone else is doing it as well. I can't be here every day.— Vchimpanzee · talk · contributions · 15:39, 26 December 2012 (UTC)

Also, User talk:Ummit has had no responses since December 21.— Vchimpanzee · talk · contributions · 17:45, 26 December 2012 (UTC)
I did check Special:Contributions/Scsbot and the bot was inactive for five days but very active yesterday.— Vchimpanzee · talk · contributions · 17:48, 26 December 2012 (UTC)

List of test wikipedia of incubator

Is it possible to make a list of the Incubator's test Wikipedias that is renewed by a bot? An attempt on the Russian Wikipedia: ru:Википедия:Список Википедий в инкубаторе. Its talk pages: ru:Обсуждение Википедии:Список Википедий в инкубаторе, ru:Википедия:Форум ботоводов#Википедия:Список Википедий в инкубаторе --Kaiyr (talk) 14:53, 27 December 2012 (UTC)

I am currently applying to be a member of BAG and input is greatly appreciated.—cyberpower OfflineHappy 2013 13:31, 2 January 2013 (UTC)

Can someone point me to a simple Python bot's sourcecode?

Hi folks, I'd like to expand how I'm contributing to WP by trying to write some bots. The first idea is simple:

  • Scan through an article
  • Look for certain templates (for example, cite journal)
  • For those templates, look for and extract the value from one of the parameters (for example, pmid)
  • Go against an external site to retrieve some lookup values based on the template parameter value
  • Edit the article and add new parameters to the template with lookup results
  • Commit

I'm sure an experienced bot-writer could knock this out in an hour, but I want to do it myself. Can anybody point me to the source code of a simple Python-based bot that might be a useful shell for me to build such a bot? Appreciate it... Zad68 03:50, 3 January 2013 (UTC)

Not a Python programmer, but point 3 will probably be your hardest one. Depending on how complex the template is, you may be able to get away with using regular expressions; however, I'd recommend trying to use a parsing framework like MWParserFromHell --Chris 03:54, 3 January 2013 (UTC)
I live my life every day in Emacs so I'm pretty familiar with regular expressions. Thanks for the link... hope someone can point me to some sample source code. Cheers. Zad68 03:59, 3 January 2013 (UTC)
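As a starting point, here is a regex-only sketch of steps 2-3 (finding flat cite templates and pulling out a parameter value). It will misbehave on nested templates inside parameter values, which is exactly why the parsing-framework route is safer:

```python
import re

def extract_pmids(wikitext):
    """Pull pmid values out of flat {{cite journal}} templates.
    Simplified: nested templates in parameter values defeat this regex."""
    pmids = []
    for tmpl in re.findall(r"\{\{\s*cite journal\b[^{}]*\}\}", wikitext, re.I):
        m = re.search(r"\|\s*pmid\s*=\s*([^|}\s]+)", tmpl)
        if m:
            pmids.append(m.group(1))
    return pmids

print(extract_pmids("Text.<ref>{{cite journal |title=Example |pmid=12345}}</ref>"))
# ['12345']
```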
I suggest downloading the Python Framework (http://sourceforge.net/projects/pywikipediabot/). It contains many examples. Ganeshk (talk) 04:02, 3 January 2013 (UTC)
Documentation for the Python Wikipediabot Framework can be found here.  Hazard-SJ  ✈  04:11, 3 January 2013 (UTC)
Cheers thanks very much! I've created my 'bot' account, there's no problem with me writing some little tests and using that bot account to try things out, right? My understanding is that I only need to ask for approval when I want to really start using it? Zad68 04:46, 3 January 2013 (UTC)
No problem with running tests in your user space (For example, User:Zad68/sandbox). You will need bot approval for editing other namespaces. Ganeshk (talk) 04:51, 3 January 2013 (UTC)
Hi. I have a rather large collection of python scripts, all using my version of Pywikipedia's rewrite branch. This is probably the simplest script I have. This script fetches content from an external website (reddit's API) and updates a wikipage based on it.
This script is actually pretty close to what you're looking to do. It fetches info from an external website, parses it, updates certain template values based on the info, and saves the page. Legoktm (talk) 07:46, 3 January 2013 (UTC)
  • Excellent, thanks! Hope it wouldn't be a bother if I stop in here from time to time with questions. Zad68 04:08, 4 January 2013 (UTC)

Any comments would be appreciated ·Add§hore· Talk To Me! 16:16, 10 January 2013 (UTC)

API questions

Is there a way to get the edit count of an IP account? Although the mw:API:Users example includes an IP, it doesn't seem to work. NE Ent 22:27, 10 January 2013 (UTC)

Not using the API, since IPs aren't stored in the user table in the database. You can use X!'s tool for IPs though which I'm assuming just counts how many rows in the revision table the IP has. Legoktm (talk) 22:32, 10 January 2013 (UTC)
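There is no editcount property for IPs, but `list=usercontribs` does accept an IP as `ucuser`, so a script can page through the results and count the revisions itself, much as the tool above presumably counts revision-table rows. A sketch of the request construction only (the paging parameter is an assumption and untested here):

```python
import urllib.parse

def usercontribs_url(ip, uccontinue=None):
    # Build a list=usercontribs query for an IP; counting the revisions
    # returned across pages gives the edit count.
    params = {"action": "query", "list": "usercontribs",
              "ucuser": ip, "uclimit": "500", "format": "json"}
    if uccontinue:
        params["uccontinue"] = uccontinue
    return "https://en.wikipedia.org/w/api.php?" + urllib.parse.urlencode(params)

print(usercontribs_url("192.0.2.1"))
```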

I am currently (self) nominated to become a member of BAG (Bot Approvals Group). Any questions and input you may have are invited with open arms here. ·Add§hore· Talk To Me! 21:39, 16 January 2013 (UTC)

Bot block template

There is a discussion on the administrators' noticeboard regarding a new template designed for alerting bot operators that their bot has been blocked: Wikipedia:Administrators' noticeboard/Archive244#Blocking misbehaving bots. Feedback from bot operators is welcome. 28bytes (talk) 20:31, 18 January 2013 (UTC)

Text after reflinks

Moved to Wikipedia:Bot_requests#Text_after_reflinks. — Preceding unsigned comment added by Hellknowz (talkcontribs) 11:13, 20 January 2013 (UTC)

LinqToWiki: new library for accessing the API from .Net

I'd like to introduce LinqToWiki: a new library for accessing the MediaWiki API from .Net languages (e.g. C#). Its main advantage is that it knows the API and is strongly-typed, which means autocompletion works on API modules, module parameters and result properties and correctness is checked at compile time. Any comments are welcome. User<Svick>.Talk(); 17:56, 17 February 2013 (UTC)

Templates with too many parser functions

I wrote a script to search for templates with many parser functions that are worth converting to Lua now: mw:Special:Code/pywikipedia/11099. Bináris (talk) 06:17, 21 February 2013 (UTC)
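The linked script does the real work; for a flavor of the kind of heuristic such a search might use, a rough count like this would flag conversion candidates. This is a guess at the approach, not the script's actual code:

```python
import re

def count_parser_functions(wikitext):
    # Rough heuristic: count occurrences of {{#if:, {{#switch:, {{#expr:
    # and other parser-function invocations.
    return len(re.findall(r"\{\{\s*#\w+:", wikitext))

print(count_parser_functions("{{#if:{{{1|}}}|{{#switch:{{{2}}}|a=x|b=y}}|}}"))
# 2
```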

Adminstats

Due to a bug, I have temporarily shut down adminstats.—cyberpower ChatOnline 13:38, 21 February 2013 (UTC)

Bot library or example of extracting and parsing refs and cites?

Does anybody have Python code to parse an article and extract all the refs, maybe even parse through a few of the more popular template:cites (cite book, cite journal, etc.)? I'd like to develop a number of little utilities that manipulate refs and cite info, but wanted to see if someone has already laid this sort of groundwork. Ideally it'd be a library function that took a page as input and returned a list of data structures containing the ref and cite info, bonus points if the library had an API to manipulate the data and apply it back to the page for writing. I scanned through a bunch of the existing code I was able to find but didn't come across anything. Any help appreciated, cheers! Zad68 15:07, 27 February 2013 (UTC)

If you're using Python, check out MWParserFromHell --Chris 15:16, 27 February 2013 (UTC)
Well now if that wasn't exactly what I was asking for! Poked around with it last night, looks very useful. Thanks! Zad68 14:06, 28 February 2013 (UTC)
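For anyone following this thread later, a stdlib-only stand-in for the first step (collecting ref contents) looks like the sketch below. It is deliberately simplified; self-closing reuse tags and attribute quirks are why the parser library recommended above is the better tool:

```python
import re

def extract_refs(wikitext):
    # Return the contents of <ref>...</ref> pairs. Ignores self-closing
    # <ref name="x"/> reuse tags and refs with ">" inside attributes.
    return re.findall(r"<ref[^>/]*>(.*?)</ref>", wikitext, re.S | re.I)

print(extract_refs('a<ref>{{cite book |title=B}}</ref>b<ref name="x">plain</ref>'))
# ['{{cite book |title=B}}', 'plain']
```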

Wikidata

As some of you may know, Wikidata interwiki links went live on the Hungarian Wikipedia today. Many editors started removing interwiki links en masse. However, it was soon realized that the interwiki bots were still running, and they started readding links. I assume that we want to prevent this from happening when Wikidata is turned on for the English Wikipedia... --Rschen7754 21:30, 14 January 2013 (UTC)

The ideal way would be to contact all interwiki bot operators and ask them to stop when it's time to do so; however, with the number of bots running I doubt that would be possible. Blocking bots would also work, but that's not necessarily the best idea if the bots are doing other tasks. As a last resort, we could implement a filter to stop them. Legoktm (talk) 21:35, 14 January 2013 (UTC)
Also, if the interwiki links are still in page text, will they interfere with the wikidata links? We should probably have a bot that will go around and mass remove them (after verifying they exist on wikidata) rather than having editors manually do it. Legoktm (talk) 21:37, 14 January 2013 (UTC)
They won't interfere, from what I've seen. --Rschen7754 21:38, 14 January 2013 (UTC)
(ec)Ooof, this is going to be annoying. For collegiality purposes, we've approved interwiki bots fairly freely based on operator competence. We have at least 126 interwiki bots, and I suspect the number is larger. Once WikiData goes live, I think we should notify the 126 known operators and try to use some sort of RSS/recentchanges scanner to detect the others that slip through. MBisanz talk 21:39, 14 January 2013 (UTC)
I've also been told that there may be some update to pywikipedia, but I'm not familiar with the specifics. --Rschen7754 21:42, 14 January 2013 (UTC)

[5] goes into more detail. Speaking of which, if you have an interwiki bot that runs on the Hungarian Wikipedia, we would appreciate it if you changed your code... --Rschen7754 22:45, 14 January 2013 (UTC)

(edit conflict) Hi everybody, I wrote to pywiki developers to update interwiki bots. (I am a developer myself, but I have never worked with interwiki.py.) I don't think 126 bot owners should be notified; if a developer updates the code, they will be responsible for updating their bots in a reasonable time. Let's see what happens. There are more phases: the Hungarian Wikipedia will be followed by Hebrew and Italian in the second step, and English in the next phase. The remaining wikis come last. Link FA templates will still be handled by interwiki bots for a while until Wikidata integrates them. It's easier to follow these happenings in the pywiki code, I suppose. Cheers, Bináris (talk) 22:50, 14 January 2013 (UTC)

I don't trust all developers to update their code. A note reminding them to update their code might work. MBisanz talk 23:56, 14 January 2013 (UTC)
Also, see d:Wikidata:Project_chat/Archive/2012/12#Featured_articles_and_good_articles.  Hazard-SJ  ✈  03:50, 16 January 2013 (UTC)

A temporary solution might be to use the edit filter to block (disallow) either bot changes to interwikis or (and I would think this is more efficient) bot edits with an interwiki.py edit summary. Of course, 1% of those edits are still probably going to be good ones; but in any case it would generate a handy list of current interwiki bots and give owners some breathing space if they forget to convert. - Jarry1250 [Deliberation needed] 09:55, 16 January 2013 (UTC)
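The summary-matching idea could hinge on a pattern along these lines. The regex and the sample summary are illustrative only; real interwiki.py summaries vary by language and pywikipedia version, so an actual filter would need tuning against live data:

```python
import re

# Match the shape of default interwiki.py edit summaries such as
# "r2.7.2) (Robot: Adding fr:Exemple" (example summary is made up).
IW_SUMMARY = re.compile(r"robot[:.]?\s+(adding|removing|modifying)", re.I)

print(bool(IW_SUMMARY.search("r2.7.2) (Robot: Adding fr:Exemple")))
# True
```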

Legoktm had coded a filter to get a list of current interwiki bots, but it was generating a lot of noise for those who monitor the filter feed, so he disabled it until we get closer to the date and have something useful to communicate to the bot ops (like "Stop adding interwikis. Update the .py extension and only add FA stars."). MBisanz talk 13:05, 16 January 2013 (UTC)
As MBisanz said, I created filter 524 but it was setting off User:Mr.Z-bot on IRC, so I disabled it.
I ran a query on the recentchanges table and came up with User:Legoktm/Interwiki bots which shows 71 active bots. Legoktm (talk) 17:29, 16 January 2013 (UTC)

Just another FYI since this wasn't brought up - the bots will still need to add the FA/GA stars when the article is FA/GA on other Wikipedias. Unfortunately that's not in Wikidata yet. --Rschen7754 10:03, 16 January 2013 (UTC)

Wikipedia:Wikidata interwiki RFC has been started. --Rschen7754 09:33, 17 January 2013 (UTC)

The date is February 11th: http://blog.wikimedia.de/2013/01/30/wikidata-coming-to-the-next-two-wikipedias/ --Rschen7754 20:11, 30 January 2013 (UTC)

So, Wikidata goes live on English Wikipedia in two days. How can we contact all the bot owners to let them know that their bots will no longer be needed to update the interwikilinks in the same way as they have done? It would be great to try for a nice clean transition where interwiki code can begin to slowly be removed from en.wiki. Bot owners should try and direct their bots towards organising interwikis on wikidata instead. Delsion23 (talk) 01:02, 9 February 2013 (UTC)

User:Legoktm/Interwiki_bots is a start, at least. —Theopolisme (talk) 01:28, 9 February 2013 (UTC)
I have a slightly larger list at User:Addshore/Sandbox which is merged with Legoktm's list as well as those bots listed at Wikipedia:Bots/Status/active_bots that say they are interwiki bots. ·Add§hore· Talk To Me! 18:11, 10 February 2013 (UTC)

And we're live now! --Rschen7754 21:15, 13 February 2013 (UTC)

Is there any way of contacting all the bot owners mentioned in the lists above? Delsion23 (talk) 22:26, 13 February 2013 (UTC)
If we draft a message to send to them I will stick it on their talk pages :) ·Add§hore· Talk To Me! 23:57, 13 February 2013 (UTC)
  • With pyrev:11073, all interwiki.py bots on enwp should be effectively disabled. Any pywikipedia bot that keeps editing has not updated its code, and probably should be blocked for not following Wikipedia:BOTPOL#Interwiki_links, specifically the part saying they should be updated daily. Legoktm (talk) 05:42, 14 February 2013 (UTC)
  • Ok I've drafted a message at User:Legoktm/Wikidata. Please edit/copyedit/etc it. If there are no objections in 2 hours, I'll message it to all bot operators from Addshore's list. Legoktm (talk) 06:30, 14 February 2013 (UTC)
    • Hope they all understand English... --Rschen7754 06:34, 14 February 2013 (UTC)
      The problem is more that some of them prefer to get a message on their home wikis. For instance, the operator of User:Rubinbot, who still continues adding interwikis and reverted my earlier edit, speaks English but needs to be contacted on the Russian Wikipedia, where I have no account.--Ymblanter (talk) 07:05, 14 February 2013 (UTC)
      I was planning to deliver them manually, and normally those bot operators have indicated to contact them on their home wiki. Legoktm (talk) 07:06, 14 February 2013 (UTC)
      Then we should be fine.--Ymblanter (talk) 07:42, 14 February 2013 (UTC)
  • I'm starting now, will post a list once I'm done. Legoktm (talk) 09:14, 14 February 2013 (UTC)
    •   Done. A list of which wikis I left the notes on is here. Legoktm (talk) 10:47, 14 February 2013 (UTC)
Does the new rule not to change any langlinks anymore also affect semi-automated langlink bots? There are still many different reasons why langlinks need to be updated (static redirects, anchor langlinks, wrong local langlinks, ...) Merlissimo 10:55, 14 February 2013 (UTC)
If the language links can be updated on Wikidata (and the local langlinks removed), that would be preferable. But if local langlinks are necessary for some reason, then it is ok to update them. Anomie 12:55, 14 February 2013 (UTC)
  • Rubin16, the operator of RubinBot, wrote about a week ago that he is currently travelling and temporarily has no access to the bot console. Maybe set up a temporary personal deny-filter for this bot if it doesn't stop within 2 days? To Yaroslav: write to me if you need anything in Russian. With regards. — Jack 14:14, 14 February 2013 (UTC)
    Thanks, I will have this in mind.--Ymblanter (talk) 15:27, 14 February 2013 (UTC)

I notice that Rubinbot is still editing, that its bot approval request lists it as being based on pywikipedia, and that Legoktm argued above that pywikipedia-based bots that don't update themselves and continue re-adding interwiki links should be blocked under the listed policy. This is not an area I'm experienced in. Is my understanding correct here, and if so, would blocking be premature? No criticism of the author intended; I just suspect that at some point people will want to stop seeing back-and-forth interwiki adding/removing. (Don't panic, I have absolutely no intention of taking action without some discussion.) --j⚛e deckertalk 18:42, 15 February 2013 (UTC)

Addshore already blocked and unblocked Rubinbot after the operator let him know that the bot has been disabled. IMO we should only block bots that are actively causing problems, like re-adding links that were removed because of Wikidata. Legoktm (talk) 18:48, 15 February 2013 (UTC)
And also some items are not on Wikidata, just today I created two from my watchlist here.--Ymblanter (talk) 18:53, 15 February 2013 (UTC)
Ahh, in fact, it did stop an hour or two back. My bad. --j⚛e deckertalk 18:57, 15 February 2013 (UTC)
I would say about half of the articles that have interwikis have Wikidata entries that are missing some of the interwikis that are on EN. User:Addbot/log/wikidata shows all of the interwikis that needed to be added after my bot ran over just 50 pages. ·Add§hore· Talk To Me! 20:36, 17 February 2013 (UTC)
I was just about to post that here, any comments are welcome as we have of course gone through pretty much the same as this but on a smaller scale! ·Add§hore· Talk To Me! 15:11, 7 March 2013 (UTC)

Test wiki

Can someone remind me where the test wiki is? I'm updating my bot framework and would like to try out a new function I just created.—cyberpower ChatOnline 03:29, 6 March 2013 (UTC)

test.wikipedia.org ...? Legoktm (talk) 03:35, 6 March 2013 (UTC)
  Facepalm. It's always the simple things I forget.—cyberpower ChatOffline 04:52, 6 March 2013 (UTC)
This question is a thing of beauty! ·Add§hore· Talk To Me! 16:36, 6 March 2013 (UTC)

PHP help

The simple testcase at User:Chartbot/simplified dies with an HTTP 517 error as it is, but works fine if I send 634 bytes instead of the 635 that the testcase uses. It seems that PHP sends an Expect: 100-continue header when the request gets too long, and that's getting refused. Does anyone know either how to get PHP to stop sending the header, or how to get the Wikimedia server to not get cranky about it, without me having to learn yet another PHP library? All the parts of the bot that I thought would be hard actually work, and it's frustrating to be stumped with my toe on the finish line.—Kww(talk) 06:30, 6 March 2013 (UTC)

Got a version working with cURL, which lets me suppress the "Expect" header.—Kww(talk) 16:21, 6 March 2013 (UTC)

How to read XML from URL so that I get < and > instead of &lt; and &gt;?

I'm getting started with a little bot coding. My question isn't necessarily even Wikipedia-specific, but here it is:

Given a particular identifier, a "PMID" (looks like "9736873") which identifies a journal article, I'm trying to go to the NIH's PubMed site and pull an XML file full of metadata about the article. A typical lookup URL looks like: http://www.ncbi.nlm.nih.gov/pubmed/9736873?report=xml&format=text. In my browser I get back something that looks like:

 <PubmedArticle>
     <MedlineCitation Owner="NLM" Status="MEDLINE">
         <PMID Version="1">9736873</PMID>
         <DateCreated>
             <Year>1998</Year>
             <Month>10</Month>
             <Day>01</Day>
         </DateCreated>
         <DateCompleted>
             <Year>1998</Year>

..etc. I want to use BeautifulSoup to parse the resulting XML and pull out particular fields. My code looks like:

 if param.name == 'pmid' and param.value:
   pmid = str(param.value).strip()
   url = "http://www.ncbi.nlm.nih.gov/pubmed/" + pmid + \
         "?report=xml&format=text"
   f = urllib.urlopen(url)
   xml = f.read()
   f.close()
   soup = BeautifulSoup(xml)

My problem is that the BeautifulSoup parse doesn't work because what I'm actually getting back from PubMed looks like:

 <?xml version="1.0" encoding="utf-8"?>
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
 <pre>
 &lt;PubmedArticle&gt;
     &lt;MedlineCitation Owner="NLM" Status="MEDLINE"&gt;
         &lt;PMID Version="1"&gt;9736873&lt;/PMID&gt;
         &lt;DateCreated&gt;
             &lt;Year&gt;1998&lt;/Year&gt;

so instead of < and > characters I'm getting &lt; and &gt;, and BeautifulSoup isn't parsing it. If I copy the text out of the browser, paste it into a file, and read from the file, it works.

I tried using the requests library instead of urllib and got the same result. What am I doing wrong? Cheers... Zad68 19:10, 7 March 2013 (UTC)

You're not really doing anything wrong; it's just that the site isn't actually returning XML, it's returning HTML. If you open up the page, right-click and select "View Source", you'll see the real text, and that once again has the &lt; and &gt; etc. I suppose you could run it through an HTML parser first (something to do the same thing as this tool), but that's a bit messy. If you want to do it like that, see this forum thread (you'll want to decode all the text within the <pre> tag). - Kingpin13 (talk) 19:38, 7 March 2013 (UTC)
Thanks I will look at that. As it's going so far, I simply did this:

soup = BeautifulSoup(xml.replace('&lt;', '<').replace('&gt;', '>'))

and it's working. It just doesn't feel right! Zad68 19:41, 7 March 2013 (UTC)
Note you'll also need to replace '&quot;' with '"', '&apos;' with "'", and '&amp;' with '&' (do that last, for obvious reasons). And hopefully you don't run across any "&#...;" or "&#x...;" codes. Anomie 12:56, 8 March 2013 (UTC)
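Anomie's point about ordering can be seen concretely in a short Python sketch (the sample string is invented): replacing '&amp;' before the other entities produces a second, incorrect round of unescaping.

```python
# Why "&amp;" must be unescaped LAST: the text "&amp;lt;" encodes the
# four literal characters "&lt;", NOT a "<" character.

def unescape_wrong(s):
    # Replacing "&amp;" first turns "&amp;lt;" into "&lt;", which the
    # next step then (incorrectly) unescapes a second time.
    return s.replace('&amp;', '&').replace('&lt;', '<').replace('&gt;', '>')

def unescape_right(s):
    # Handle the other entities first, then "&amp;" last.
    return s.replace('&lt;', '<').replace('&gt;', '>').replace('&amp;', '&')

sample = '&amp;lt; is how you write &lt; in HTML'
print(unescape_wrong(sample))  # '< is how you write < in HTML' -- double-unescaped
print(unescape_right(sample))  # '&lt; is how you write < in HTML' -- correct
```

The same ordering rule applies to any hand-rolled replace() chain, including the one quoted earlier in this thread.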
I tried xml = html_parser.unescape(xml_in_html) and that worked for a few but then bombed on something. Gee what a pain... I'm just going to go with my manual decoding! Thanks for the note that PubMed is "lying" about delivering XML, appreciate it. Zad68 19:56, 7 March 2013 (UTC)
If you provide details on the "something", we might be able to help you. A first random guess is that it might have been '&apos;', which is sometimes not supported in HTML. Anomie 12:56, 8 March 2013 (UTC)
I already had the "manual" unescape working when I tried using the HTMLparser. Once HTMLparser stack-traced I didn't pursue it further. But if we're game to look, here's the error:
 File "/usr/lib/python2.7/HTMLParser.py", line 472, in unescape
   return re.sub(r"&(#?[xX]?(?:[0-9a-fA-F]+|\w{1,8}));", replaceEntities, s)
 File "/usr/lib/python2.7/re.py", line 151, in sub
   return _compile(pattern, flags).sub(repl, string, count)
 UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 1: ordinal not in range(128)
The PubMed result it is trying to process is from http://www.ncbi.nlm.nih.gov/pubmed/?term=3776457&report=xml&format=text. The result type is str. The specific part it is choking on is the first author's LastName, "Mölsä", which has two vowels, each with a diaeresis. This appears in the source returned from PubMed as M\xc3\xb6ls\xc3\xa4. If I pre-process what PubMed gives me and remove that last name with html_parser.unescape(xml_in_html.replace('M\xc3\xb6ls\xc3\xa4',' ')), it works. Any insight appreciated! Zad68 14:15, 8 March 2013 (UTC)
html_parser.unescape() is expecting ASCII text as input, so the UTF-8-encoded input is causing it to fail. Check the documentation for how to change the expected codec, or hope someone familiar with python chimes in. Anomie 12:30, 10 March 2013 (UTC)
string.decode('utf-8') should fix that. Legoktm (talk) 12:32, 10 March 2013 (UTC)
  Resolved
And it does! Perfect, thank you Legoktm and Anomie for the help! Zad68 02:01, 13 March 2013 (UTC)
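The thread above is Python 2; in modern Python 3, where str.decode becomes bytes.decode and html.unescape replaces HTMLParser.unescape, the two fixes combine into a minimal sketch like this (the sample bytes are invented to mimic the PubMed output quoted above):

```python
import html

# Simulated response body: UTF-8-encoded bytes containing HTML-escaped
# XML, like the PubMed "report=xml&format=text" page discussed above.
raw = b'&lt;LastName&gt;M\xc3\xb6ls\xc3\xa4&lt;/LastName&gt;'

# 1. Decode the UTF-8 bytes to text *before* unescaping; feeding raw
#    bytes to the unescaper is what caused the UnicodeDecodeError above.
text = raw.decode('utf-8')

# 2. Unescape the HTML entities. This handles &lt; &gt; &amp; &quot;
#    and numeric &#...; codes, so no hand-rolled replace() chain needed.
xml = html.unescape(text)

print(xml)  # <LastName>Mölsä</LastName>
```

The decoded, unescaped string can then be handed to BeautifulSoup (or any XML parser) as usual.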

To ALL bot owners

All botops should be aware that Wikipedia just experienced a server failure that affected all bots. If your bot just edited in a strange and unusual manner, it may not have been the bot's fault.—cyberpower ChatOnline 00:49, 13 March 2013 (UTC)

That's very vague. Anomie 01:35, 13 March 2013 (UTC)
Not really. What information is missing?—cyberpower ChatOnline 02:50, 13 March 2013 (UTC)
To be more specific, there was an issue with requests intermittently returning 404s to valid destinations, but only on https. (I think) Legoktm (talk) 02:52, 13 March 2013 (UTC)
It wasn't just https; otherwise my bot wouldn't have failed. If you want specifics, the techies accidentally uninstalled PHP on a bunch of servers while trying to update it. This obviously caused Wikipedia to fail if you happened to hit a broken server, or if one of your bots hit one. If the bot hit a broken server while requesting and a good one while editing, well...—cyberpower ChatOnline 03:02, 13 March 2013 (UTC)
If your bot hit a broken server while requesting and still went on to make an edit based on that missing data, your bot needs to have proper error handling added. Anomie 10:39, 13 March 2013 (UTC)
Actually, most of the time, my bot was stalled.—cyberpower ChatOnline 12:44, 13 March 2013 (UTC)
Even AnomieBot screwed up during this.—cyberpower ChatOnline 12:49, 13 March 2013 (UTC)
As I said... ;) Thanks for letting me know. Anomie 15:00, 13 March 2013 (UTC)
But you do have a point. I am updating my framework, so I will work on creating an API Failure function to it.—cyberpower ChatOnline 12:51, 13 March 2013 (UTC)
Good call, Anomie: bot owners should update their bot's error handling if the bot edited through this. It was a useful post; its vagueness is beside the point, as it is just a call to check your bot's edits. -166.137.191.25 (talk) 23:53, 13 March 2013 (UTC)

mwparserfromhell doesn't seem to recognize ref tags?

Hey folks, I'd like to modify my bot (coded in Python) to pull the wikicode for an article, extract all the <ref>...</ref> definitions, put them in a list, and do whatever I want with them. The first use will be to create a table of refs for me to review when doing GA or FA reviews. The bot already uses the mwparserfromhell library, and with it I can parse out templates nicely, but now I'm trying to use it to parse out ref tags and having trouble; they appear to be treated as text by the library:

  >>> import mwparserfromhell
  >>> text = "I has a template!<ref>{{foo|bar|baz|eggs=spam}}</ref> See it?"
  >>> wikicode = mwparserfromhell.parse(text)
  >>> wikicode.filter_templates()
  [u'{{foo|bar|baz|eggs=spam}}']
  >>> wikicode.filter_tags()
  []
  >>> wikicode.filter_text()
  [u'I has a template!<ref>', u'</ref> See it?']

Any idea what I'm doing wrong? Cheers... Zad68 15:33, 13 March 2013 (UTC)

Replied on my talk page if anyone is curious. — Earwig talk 21:50, 13 March 2013 (UTC)
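For anyone hitting the same limitation, one crude workaround is to bypass the parser and use a regular expression. This is only a rough sketch: it deliberately ignores self-closing refs such as <ref name="x" /> and won't handle nested or malformed tags, so a real parser remains preferable when one works.

```python
import re

# Very rough extraction of <ref>...</ref> contents from wikitext.
# Ignores self-closing refs (<ref name="x" />) and nested/malformed
# tags -- a proper wikitext parser is better when available.
REF_RE = re.compile(r'<ref(?:\s[^>/]*)?>(.*?)</ref>', re.DOTALL | re.IGNORECASE)

def extract_refs(wikitext):
    return REF_RE.findall(wikitext)

text = "I has a template!<ref>{{foo|bar|baz|eggs=spam}}</ref> See it?"
print(extract_refs(text))  # ['{{foo|bar|baz|eggs=spam}}']
```

Each extracted string can then be fed back through mwparserfromhell.parse() to pull out the citation templates inside it.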

Cyberbots I and II Update

It is imperative that any out-of-the-norm edits from now on be reported to me. I am about to load an update to the framework's edit function into my bot, which will affect all tasks. I will of course be watching myself, but I may miss something as I recode Peachy.—cyberpower ChatOnline 23:22, 14 March 2013 (UTC)

Functions in RFBA

Can we routinely ask users to post the function overview even if the request is to take over another bot's task? There is no standard way that bot tasks are linked, as far as I can tell, and if it is another bot's task, it takes forever for me to search for it, sometimes with no result.

I think it is more straightforward to skip the trail of breadcrumbs/links and just include a sentence in the function overview about what the bot does.

Thanks, -68.99.89.234 (talk) 11:43, 10 March 2013 (UTC)

Is it really too much to ask that RFBAs include a brief function overview for editors who are not entrenched here? It is such a simple thing. Instead of writing a wordy sentence that contains no information but a redirect to another bot (with no link to that bot, just its name, and that bot's page doesn't contain the specific link either), why not just describe in one sentence what the bot is doing? If the bot task is so complex that it cannot be described in one sentence, then it requires more than a link; if it can be described in a single sentence, that is the point of the function overview. I am sure to hear the usual campfire BS about how there is no point to anything.
Thanks for the usual acknowledgement that anyone besides bot regulars does not belong commenting on bots. Shall I link to the endless discussions about the lack of participation? -68.107.140.25 (talk) 18:40, 17 March 2013 (UTC)
Agree with your initial proposal - it should be easy enough to copy the information from the original bot and paste it into the new bot request, so each bot request can stand on its own. GoingBatty (talk) 20:56, 17 March 2013 (UTC)
I rather thought there was no participation because you were making a mountain out of a molehill. My request was administrative in nature, not a request for a new task, and was "shorthanded" accordingly. I agree with you that I should always be conscious of the fact that the BAG is not the only audience to requests and that I should better clarify my requests in the future. This one issue doesn't really merit a change in "routine" though. Thanks, — madman 15:36, 18 March 2013 (UTC)
Really? So, what, an RFBA wasn't needed? No, an RFBA was needed, and it would have taken you a tenth of the time or less to simply write the function in the RFBA, instead of redirecting me to a page that doesn't even have the function listed. It's not that complex a task; just write it out as a courtesy to users who are running through the RFBAs to see if they want to comment. It should not merit a change in the routine; this should be the routine already. The form gives you a brief summary and then a detailed outline of the task: put a sentence in the former, rather than sending users on a pointless goose chase of empty links in the latter. -68.107.140.25 (talk) 03:40, 19 March 2013 (UTC)

AIV Helperbots

Howdy. After a discussion at ANI, I'm hoping that somebody here might be able to help. Over the last few weeks, HBC AIV helperbot5 (talk · contribs) and HBC AIV helperbot7 (talk · contribs) have been intermittently disappearing from WP:AIV for anywhere from a few hours to nearly a day. The bots appear to be functioning normally at WP:UAA - or at least more normally - so it's not as if the bots are just dead. Because AIV maintenance is a fairly important function and I'm not sure if the bots' operators are around at the moment, I was wondering if anybody here had any idea what the problem might be? Thanks. --Bongwarrior (talk) 01:33, 22 March 2013 (UTC)

Yetanotherbot's been in a trial for cloning these bots, but it was stopped at the end of its trial, so it hasn't been active. (X! · talk)  · @116  ·  01:47, 22 March 2013 (UTC)
I'm not sure if they'll disappear again, but right now it looks like they're working fine. --Bongwarrior (talk) 03:34, 24 March 2013 (UTC)

DASHBot blocked

FYI, DASHBot has been blocked as it is malfunctioning; Tim is aware. As soon as the code is fixed I'll unblock. Regards, GiantSnowman 20:05, 23 March 2013 (UTC)

New exceptions in Pywikipedia

From now on you may use two new exceptions in Pywikipedia (r11300). When replacing text with replace.py fixes, you may add these to the 'inside-tags' section of exceptions:

  • 'property' for Wikidata property inclusions
  • 'invoke' for module invocations (currently only Lua)

Cheers, Bináris (talk) 09:38, 30 March 2013 (UTC)
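For illustration, a user-fixes.py entry using the new keys might look like the following. The fix itself ('example-typo') and its replacement are invented; only the 'property' and 'invoke' exception names come from the announcement above.

```python
# Hypothetical entry for a pywikipedia user-fixes.py file.  The fix
# ('example-typo') is made up; the point is the 'inside-tags'
# exceptions, which tell replace.py to skip matches found inside
# Wikidata property inclusions and (Lua) module invocations.
fixes = {}
fixes['example-typo'] = {
    'regex': True,
    'msg': {'en': 'Bot: fixing a common typo'},
    'replacements': [
        (r'\bteh\b', 'the'),
    ],
    'exceptions': {
        'inside-tags': [
            'comment',    # leave HTML comments alone
            'property',   # new: Wikidata property inclusions
            'invoke',     # new: module (Lua) invocations
        ],
    },
}
```

Running `python replace.py -fix:example-typo ...` would then apply the replacement everywhere except inside those constructs, assuming the fix is loaded from user-fixes.py as usual.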

Cyberbot II blocked

See discussion at User talk:cyberpower678#Buggy bot, and feel free to jump in and help if you're familiar with the framework that cyberpower is using. ‑Scottywong| speak _ 22:29, 3 April 2013 (UTC)

Umm, why didn't you just disable the run page like Cyberpower told you to use? That seems like the first thing to do if a bot isn't working properly... Legoktm (talk) 23:19, 3 April 2013 (UTC)
Frankly, because I don't have sufficient confidence that the problem will be fixed when Cyberpower decides to restart the task on his own. ‑Scottywong| express _ 01:17, 4 April 2013 (UTC)
So now once Cyberpower finishes fixing his code, he now has to wait for you (or another admin who has "confidence" in him) to unblock it, before he can deploy it, except you won't have any idea whether the problem is actually fixed... Legoktm (talk) 02:33, 4 April 2013 (UTC)
I let him know about a problem with his bot, his response was essentially that he didn't think that it's a big problem, and he'll take care of it when he gets around to it. I told him I'd block the bot if it continued making erroneous edits. He claimed to have fixed the problem, but the bot continued to make errors, so I blocked it. I don't need to inspect his code in order to be satisfied that the problem is fixed, but I'd like to at least have a discussion about what the problem was, and how he fixed it. At this point, it seems like there are still holes in his algorithm (at least from the way he describes it). If his source code is public, perhaps someone who is familiar with the framework he uses can take a look at it and comment. ‑Scottywong| squeal _ 03:48, 4 April 2013 (UTC)

Citation Bot needs update

As described here, Citation bot (BRFA · contribs · actions log · block log · flag log · user rights) needs an update to be fully compatible with the new Lua-based citations when there are more than 9 authors or more than 4 editors. However, the bot's original author doesn't seem to have much time for it anymore [6]. He invites others to improve the source, though, and I was hoping that someone else might feel inclined to work on the needed improvements. Dragons flight (talk) 22:52, 3 April 2013 (UTC)

Cyberbots I and II Update

In addition to the core framework being updated and bug fixes being implemented on the currently shut-down tasks, Cyberbots I and II will be slowly migrating to Labs. Services such as RfX reporter, tally, cratstats, and adminstats that many of you use may be cut for a bit. Also, because I am unfamiliar with Labs at the moment, I have absolutely no idea how adminstats will run there. If there are any errors or out-of-the-norm edits in the coming future, please let me know.—cyberpower ChatOffline 15:17, 10 April 2013 (UTC)

Does Labs have SQL database replication yet? I haven't been keeping up on it. ‑Scottywong| confabulate _ 15:24, 10 April 2013 (UTC)
Like I said, I'm not familiar with Labs yet, so I'm making a slow transition over as I get familiar with it. Since it's as good as confirmed that the Toolserver will be shut off in the future, I might as well start the transition.—cyberpower ChatOffline 15:28, 10 April 2013 (UTC)
I'll likely need to do the same at some point, but I probably won't start an account there until db replication is stable. ‑Scottywong| express _ 17:14, 10 April 2013 (UTC)
Labs doesn't have replication yet, there was an update from Coren on the Toolserver-l/Labs-l a few days ago. Last I heard the eta was by the hackathon. Legoktm (talk) 18:00, 10 April 2013 (UTC)

EdinBot

Could someone please check Special:Contributions/EdinBot - I think it may be operating on the wrong wiki. Thanks! GoingBatty (talk) 01:58, 15 April 2013 (UTC)

Seems like it should be running on bs.wikipedia.org, because that is where it is getting the edit counts from (and where it is approved). Although there is no page by the name "Wikipedia:Spisak korisnika po broju izmjena" on that wiki... -- Cheers, Riley 02:06, 15 April 2013 (UTC)
Thanks for pointing me in the right direction - I've left the bot owner a note at bs:Razgovor sa korisnikom:Edinwiki#Edits on en.wikipedia. Thanks also to User:Soap for blocking the bot while this gets sorted out. GoingBatty (talk) 02:16, 15 April 2013 (UTC)
FYI, User:Edinwiki quickly replied to me at User talk:GoingBatty#EdinBot. GoingBatty (talk) 02:20, 15 April 2013 (UTC)

AIV helper bot editing logged out

See Special:Contributions/185.15.59.211. Of course, I have no idea which one it is. What can/should be done when a bot is editing logged out? Thanks. Someguy1221 (talk) 03:26, 20 April 2013 (UTC)

  • Per instructions on User:HBC_AIV_helperbot7, I have emailed the bot operator. Why not temporarily block Special:Contributions/185.15.59.211 (autoblock disabled, anonymous users only) until we get a reply from the bot operator? -- Cheers, Riley 03:33, 20 April 2013 (UTC)
  • (edit conflict)Well normally you block the IP and leave a note for the operator to fix it. But if it's the only helperbot running you might not want to block it... Legoktm (talk) 03:34, 20 April 2013 (UTC)
    • Operator's response: "[...]I believe it's helperbot5 that runs on the toolserver. It looks like it's logged itself back in since though, but worth keeping an eye on[...]" :) -- Cheers, Riley 08:20, 20 April 2013 (UTC)

Blocking toolserver IPs

We have an existing consensus to softblock the toolserver IP addresses, for this very reason. I guess from the above that they have new IP addresses now; that would be 185.15.59.192/27, judging by whois data? Which is part of 185.15.56.0/22 that is assigned to WMF. And then there's 2620:0:860::/46 in IPv6 assigned to WMF too, of which we currently have 2620:0:862:101::2:0/124 blocked as toolserver.

We should probably also do the same for Labs projects; it seems edits from Tool Labs will currently come from 10.64.0.126, and I see past edits from 10.64.0.123, 10.64.0.169, and 10.64.0.170, but the full range of possible IPs is not obvious (random guess: 10.64.0.123–170, plus a few others in 10.64.0.0/12, plus some in 2620:0:861:101:10::/80 if IPv6 happens to be used). OTOH, should logged-out edits be coming from anything in 10.0.0.0/8 at all? Anomie 03:35, 21 April 2013 (UTC)
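For anyone scripting checks against these ranges, Python's ipaddress module handles the CIDR membership arithmetic. The addresses below are the ones mentioned above; the helper function itself is just an illustration, not part of any existing tool.

```python
import ipaddress

# Ranges discussed above: the new Toolserver/WMF allocations and the
# private 10.0.0.0/8 space that the Labs addresses fall into.
ranges = [
    ipaddress.ip_network('185.15.59.192/27'),  # Toolserver (per whois)
    ipaddress.ip_network('185.15.56.0/22'),    # WMF assignment
    ipaddress.ip_network('10.0.0.0/8'),        # private space (Labs)
]

def matching_ranges(ip):
    """Return the CIDR ranges (as strings) that contain the given IP."""
    addr = ipaddress.ip_address(ip)
    return [str(net) for net in ranges if addr in net]

print(matching_ranges('185.15.59.211'))  # ['185.15.59.192/27', '185.15.56.0/22']
print(matching_ranges('10.64.0.126'))    # ['10.0.0.0/8']
```

The same module accepts IPv6 networks such as 2620:0:860::/46, so both families can be checked with one list.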

Cyberbot

Cyberbot II is now running on labs. I am now migrating Cyberbot I.—cyberpower ChatOffline 18:47, 24 April 2013 (UTC)

Difficulties with TedderBot

Hello bot owners. It seems that User:Tedder has recently had difficulty with TedderBot. The bot has been out of commission for nearly 2 months. This lack of service is a great loss for the staff of the various WikiProjects, who usually rely on that bot's listings of new articles in their subject areas. Is it possible for other people to help Tedder in some way to get TedderBot up and running properly? Perhaps I should apologize for going behind Tedder's back in this way, but I believe my motivation is good. Many thanks, Invertzoo (talk) 00:57, 2 May 2013 (UTC)

One bot not archiving Wikipedia:Miscellany for deletion

I left a discussion about this on the bot owner's talk page, as well as the MfD talk page. It seems that since January 2013, One bot has been doing a great job removing closed discussions from Wikipedia:Miscellany for deletion, but has not been archiving any of the discussions. This can be seen in the bot's edit log. Steel1943 (talk) 07:13, 23 April 2013 (UTC)

I've restored the missing archive pages, but it may take a while to work out exactly why it isn't archiving properly. --Chris 11:27, 24 April 2013 (UTC)
I understand, and I appreciate the fact that you restored the missing pages single-handedly. However, One bot is still removing closed discussions from Wikipedia:Miscellany for deletion without archiving them. Is it possible to temporarily freeze that function, or have these discussions go to a temporary page, so that this information does not disappear? Steel1943 (talk) 03:42, 25 April 2013 (UTC)
Ok, I've worked out what was going wrong, and the bot should now be archiving properly. I'll keep an eye on it over the next few days to make sure it all runs smoothly. --Chris 15:35, 25 April 2013 (UTC)
It's now removing discussions that haven't been closed. Graham87 08:34, 3 May 2013 (UTC)
Sorry about that. I've been in the process of completely rewriting One bot; that shouldn't happen again. --Chris 03:40, 5 May 2013 (UTC)

Retired bot operator with approved bots

Does anything need to be done about User:Riley Huntley's approved bots? He retired under a cloud due to sharing his password, a single incident, but I think this should raise concerns about flagged bots and approved tasks. -68.107.137.178 (talk) 15:19, 1 May 2013 (UTC)

In my opinion, if a user isn't editing, their bots shouldn't be either. The same goes for admins: if an admin loses their tools, that should extend to their bot as well. A bot should never have higher access than its owner, nor should a bot be running rogue if the operator has gone missing. Kumioko (talk) 15:28, 1 May 2013 (UTC)
I do not feel anything should be done. Retired users are still entitled to run bots, and the whole incident described above does not involve the bot account. Please remember Riley is still free to edit... If at any stage they feel they wish to stop looking after the bot tasks, I am sure they will turn them off. ·Add§hore· Talk To Me! 17:50, 1 May 2013 (UTC)
Yes, Riley is not banned or anything. And he strikes me as a responsible bot owner; however, giving out his password, and having flagged bots with permissions could be a recipe for disaster, so I raise the issue here. -150.135.210.102 (talk) 17:57, 1 May 2013 (UTC)
Last I heard, Theopolisme is taking over. Of course, that could've changed.—cyberpower ChatOnline 19:03, 1 May 2013 (UTC)
I have seen bad things happen in the past regarding bots and passwords, but I don't think anything is going to go wrong here. :) Also remember it's a wiki; everything can be reverted ;p. I'll have a chat with Theopolisme :) ·Add§hore· Talk To Me! 20:20, 1 May 2013 (UTC)
I've already submitted a BRFA for template substitution (might limit to unsigned templates like in RileyBot's recent BRFA).  Hazard-SJ  ✈  23:36, 1 May 2013 (UTC)
That was a bit premature.—cyberpower ChatOffline 00:28, 2 May 2013 (UTC)

Actually, I also noticed there were a lot of such unsubstituted templates, plus as I mentioned in my BRFA, I was just extending a task my bot already does on Commons and Wikidata.  Hazard-SJ  ✈  02:42, 3 May 2013 (UTC)

I hadn't looked at the BRFA yet. :p—cyberpower ChatOnline 12:15, 3 May 2013 (UTC)

@Addshore: I talked to MBisanz back when Riley first announced his retirement (Apr 29th) and the idea is just that, if his bot dies, I'll file a takeover request (Riley emailed me all of his code). Like others said above, though, until it dies, we don't have an issue. Theopolisme (talk) 11:07, 14 May 2013 (UTC)

Sounds perfect :) ·Add§hore· Talk To Me! 11:11, 14 May 2013 (UTC)
I still think that if an editor decides to leave, dies, doesn't want to maintain a bot anymore, etc., then someone else should take over that task. What happens if the bot breaks something? Why should we have to fix problems due to a lack of preventative maintenance? I have nothing against Riley (I really don't know him or the situation), but we shouldn't have a bot running without an operator. It's just common sense, especially when we are aware of the situation and have both the bot's code and an editor willing to take over the task. Why wait until something goes wrong? It doesn't make any sense. Kumioko (talk) 16:06, 14 May 2013 (UTC)
Just because he isn't around making edits to article space doesn't mean he isn't fixing and maintaining the bot; we have a number of such editors. -DJSasso (talk) 17:34, 14 May 2013 (UTC)
Ok, fair enough. I don't really like the idea of bots running without anyone watching them, but I guess I'm the only one with a problem with that, so I'll drop it. It just seems that if he's gotten to the point where he's emailing the bot's code to others, then he's probably had it and isn't really maintaining it anymore. Kumioko (talk) 08:46, 15 May 2013 (UTC)
If it ain't broke, don't fix it ;p. User:Theopolisme could of course run their code at different timings so the bots could work side by side, and then we could switch RileyBot off. ·Add§hore· Talk To Me! 12:02, 15 May 2013 (UTC)
I don't really think that the term "If it ain't broke don't fix it" really applies but again, I have voiced my concerns and that's all I can do. Kumioko (talk) 02:29, 18 May 2013 (UTC)

Went ahead and filed a BRFA here. There's no real hurry, but, per the above arguments, "better safe than sorry." Theopolisme (talk) 14:05, 18 May 2013 (UTC)

Inactive bots

Please go to WP:VPR and offer your opinions in the "Remove bot flag from inactive bots" thread. Nyttend (talk) 00:07, 23 May 2013 (UTC)

Blocked EmausBot

Just to let everyone know that I have blocked User:EmausBot for incorrectly removing interwiki links from articles. This was first reported here over a week ago, and nothing has been done. After giving Emaus an extra day to respond, I have blocked the bot.

The way I see it, the bot removed a link from Stockert Radio Telescope to de:Astropeiler Stockert after looking at d:Q2350652, which, when the bot looked at it, contained only two links: to de:Stockert (Berg) and Stockert Radio Telescope. On further inspection, de:Astropeiler Stockert is a section redirect to de:Astropeiler_Stockert#Radioteleskop_Astropeiler_Stockert, which means the link should have been left alone and not removed.

·Add§hore· Talk To Me! 11:07, 20 May 2013 (UTC)

Hello! Unfortunately, lately I've been too busy to investigate the problem before now.
Now I have fixed it. The problem was caused by overlooking the "tofragment" parameter in the API query, which gives the section title. In this case the redirect actually goes to a page section, so my bot should ignore it (and now it does). But I am sure that when the redirect goes to the page itself, we can remove the interlanguage links, because most such redirects are the result of renaming, and the subject of the redirect page doesn't differ from the subject of its target. --Emaus (talk) 19:01, 25 May 2013 (UTC)
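The rule Emaus describes (keep the language link when the redirect has a "tofragment", i.e. points at a section) can be sketched as a tiny predicate. The dict shape here follows the MediaWiki API's redirect resolution (action=query with redirects), which reports a "tofragment" key for section redirects, but the sample entries themselves are invented:

```python
# Decide whether a redirect's interlanguage link may be removed.  The
# dicts mimic entries from the MediaWiki API's redirect resolution
# (action=query&redirects); sample data is invented for illustration.

def redirect_is_removable(redirect):
    # A section redirect (one with "tofragment") covers a narrower topic
    # than its target page, so its language link must be kept.
    return 'tofragment' not in redirect

plain = {'from': 'Old title', 'to': 'New title'}
section = {'from': 'Subtopic', 'to': 'Main article',
           'tofragment': 'Subtopic section'}

print(redirect_is_removable(plain))    # True  -- plain rename, safe to remove
print(redirect_is_removable(section))  # False -- section redirect, keep the link
```

A bot would run this check on each redirect entry returned by the API before deciding to strip a local langlink.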
Unblocked :) ·Add§hore· Talk To Me! 20:54, 25 May 2013 (UTC)

Unaddressed issues at User talk:Citation bot

Please see User talk:Citation bot/Archive1#Update required to avoid deleterious impact on new Lua-based citations. This issue is not being addressed, and there are various other complaints on the bot talk page which are similarly being ignored. Should I block the bot? --Redrose64 (talk) 21:34, 18 May 2013 (UTC)

If the bot is actively making erroneous edits, that would be grounds for a block. Anomie 17:05, 19 May 2013 (UTC)
I've blocked the bot after another user reported the same type of problem, see here. --Redrose64 (talk) 23:22, 28 May 2013 (UTC)
It is a real pity, citation bot was extremely useful.Martin451 (talk) 00:40, 29 May 2013 (UTC)
The owner of the bot hasn't responded to any concerns on the bot's talk page since October. If someone wants to take over maintaining this bot, the code appears to be here: https://code.google.com/p/citation-bot/ Dragons flight (talk) 01:08, 29 May 2013 (UTC)
Somebody who knows how to do this: PLEASE fix this (I unfortunately have not the slightest idea how). This bot is one of the most useful tools here, and not having it creates a HUGE amount of extra work. By the way, since many people use the "cite doi" template, which is handled by this bot, we're getting an increasing number of badly referenced articles with a notice that the references are going to be filled in "real soon"... The absence of this bot is, in itself, highly disruptive, and as far as I can see creates more problems than those mentioned above... --Randykitty (talk) 09:57, 1 June 2013 (UTC)

Bot takeover

I haven't been around on-wiki much lately, and I don't see that situation changing anytime in the near future. My bot has still been running on autopilot, but I haven't been monitoring it at all. The code is fairly stable, so maybe it's not a big deal. But, if anyone is interested in taking over the tasks that Snotbot still runs, let me know. I believe the main tasks that it runs are task 10 (cleaning up various things with AfD's), task 12 (archiving requests at RFPP), and another task that doesn't have a BRFA because it only edits userspace (updating the summary table at CAT:RFU). The code is Python using the pywikipedia library. If someone does take over the tasks, I'd prefer it to be someone who has some familiarity with Python and pywikipedia. ‑Scottywong| converse _ 13:43, 23 May 2013 (UTC)

I operate Theo's Little Bot, a Python bot, and am fairly familiar with pywikipedia, so I'll be happy to take it over if that's what needs to happen... I'm in the midst of doing the same for RileyBot right now, though, so it'd probably be a week or so before I could get to it. Theopolisme (talk) 13:58, 23 May 2013 (UTC)
Scotty, does User:Snotbot/AfD report list situations like those described at Wikipedia talk:Articles for deletion#Detection of misused Article for deletion/dated? --Redrose64 (talk) 14:09, 23 May 2013 (UTC)
Redrose, yes it should. Theo, contact me when you're done with RileyBot. ‑Scottywong| babble _ 13:56, 24 May 2013 (UTC)
I've got the AFDBot bot script idling and dusting on Cyberbot I. I can immediately take over that task. AFDBot removes transclusions of {{REMOVE THIS TEMPLATE WHEN CLOSING THIS AfD}} from closed AfDs.—cyberpower ChatOnline 23:53, 24 May 2013 (UTC)
Snotbot does quite a bit more than that. ‑Scottywong| spout _ 17:34, 25 May 2013 (UTC)
I know, but it gives Cyberbot I something to do with a script just idling and dusting at this point.—cyberpower ChatOnline 23:28, 25 May 2013 (UTC)
I'm a little bit disappointed now. After reading this section title on my watchlist, you had me believing momentarily that Wikipedia's bots had become sentient and had threatened all-out cybernetic revolt. I, for one, preemptively welcome our new robot overlords. Marcus Qwertyus (talk) 19:22, 25 May 2013 (UTC)
+1 Theopolisme (talk) 23:34, 25 May 2013 (UTC)
+1 ·Add§hore· Talk To Me! 09:03, 26 May 2013 (UTC)
I am experienced in Python, and would be honoured to be the controller of your bots. I have become inactive over time, but I can easily fit technical maintenance into my current schedule; I will have more free time in summer. Σσς(Sigma) 03:28, 26 May 2013 (UTC)
If I may say, given that a couple of the functions are pretty core to on-wiki practice, this should be run as a Tools project on Wikimedia Labs. Such projects are inherently multi-maintainer, so ensuring continuity would be very easy. - Jarry1250 [Vacation needed] 09:03, 26 May 2013 (UTC)
Great idea! Moving toward multi-maintainer bots is, in my opinion, a great thing. Perhaps we should start a project for general enwiki community bots? With multiple maintainers, if something goes wrong there are more people to fix it, and if one person goes inactive others are already there. Thoughts? ·Add§hore· Talk To Me! 09:10, 26 May 2013 (UTC)
Sounds like a good idea. Might be nice to organize it by programming language, e.g. enwikibots-python, enwikibots-php, etc., and then install all the standard frameworks as well (i.e., pywikipedia, mwclient, mwparserfromhell, etc. for -python; whatever those PHP folks use for -php). I could see this being very handy for new bot ops, too. Theopolisme (talk) 13:59, 26 May 2013 (UTC)
I think I spotted a comment on IRC before I slept regarding naming conventions for such multi-user bots; does anyone have any thoughts? Personally I do not think this should be a problem: admittedly you would not be able to include part of a single person's user name in the bot username, but you should be able to clearly identify the task, or that it is multi-maintainer. ·Add§hore· Talk To Me! 09:24, 27 May 2013 (UTC)
... which would mean these bots would generally have only one task.  Hazard-SJ  ✈  01:01, 28 May 2013 (UTC)
Or we can just give them all funny little robot names using one of these... Theopolisme (talk) 01:45, 28 May 2013 (UTC)
I think it is safe to try and keep 'bot' in the name :) GeneralBot or MajorBot? xD MasterBot? :O CommunityBot, LabsBot, ToolBot? Does anyone have any comments regarding these or any other suggestions? :) ·Add§hore· Talk To Me! 10:21, 28 May 2013 (UTC)

If they're truly "communal" bots then we can just make LabsBot1, fill 'er up with 10 tasks, then create LabsBot2, fill 'er up with 10 more tasks, etc...what would be really awesome is if someone created a UI for managing them, though...i.e., JIRA on steroids? So, you can "submit new task" and it gets automatically assigned to the next available bot, you just have to upload source code and it can be automatically scheduled into the crontab?...I smell unicorns.... Theopolisme (talk) 04:14, 29 May 2013 (UTC)

Right, I still think this is a great idea even if the conversation has died off a bit. It would be great if you could do the above with the tasks you are taking over Theopolisme :) ·addshore· talk to me! 08:31, 1 June 2013 (UTC)

Additional global bot right request on meta

Hi! There is currently a request for global editinterface rights for Addbot open on meta wiki here to allow the bot to edit protected pages to remove interwiki links that are already on wikidata. It has been proposed that a second global bot group be created that includes the following flags (edit, editprotected, autoconfirmed). This is not something stewards want to rush into as the flag would allow the bot to operate on protected pages and would prefer to have a wider participation in the request for approval before any action is taken. All comments should be posted at (meta:Steward_requests/Global_permissions#New_global_permissions_group)·addshore· talk to me! 14:54, 1 June 2013 (UTC)

'The bot flag' RFC

A request for comment has been started at Wikipedia:Requests for comment/The bot flag regarding removing the bot flag from inactive bots and potentially modifying the bot flag itself. The RFC started after the discussion on VP/Prop. ·addshore· talk to me! 10:28, 6 June 2013 (UTC)

Please check bot

Hello to everyone, I'm a user from the Greek (el) Wikipedia. Could you please check the bot contributions of CarsracBot? It added interwikis on the Greek (el) Wikipedia. --Vagrand (talk) 19:54, 12 June 2013 (UTC)

Where exactly are these edits happening? I have checked on enwiki and elwiki but can not see any recent edits. ·addshore· talk to me! 22:51, 12 June 2013 (UTC)

Bot using Perl literals resulting in redlink failures

In recent weeks some bot has been using Perl-style byte escapes to access articles. For article names containing diacritics or an en-dash/em-dash it fails, for example using Gal\xC3\xA1pagos Islands rather than Galápagos Islands, and Karush\xE2\x80\x93Kuhn\xE2\x80\x93Tucker conditions rather than Karush–Kuhn–Tucker conditions. This is causing escaped names to appear on WP:TOPRED and also distorting the Wikipedia article traffic statistics. Of course it might be automation beyond the scope of WP:BOT, but perhaps posting here will give someone a realisation. From the number of redlink hits it appears these page visits EXCEED user article views, so we are looking at some major bot doing a lot of page visits! Something like over a quarter of a million hits a week are not going directly to the articles. Regards, Sun Creator(talk) 23:47, 24 June 2013 (UTC)
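For what it's worth, the mangled titles are UTF-8 bytes rendered as C/Perl-style \xNN escapes instead of being used as real Unicode text. A short Python sketch (illustrative only, not the offending bot's code) shows how such a title maps back to the intended one:

```python
def unescape_title(raw):
    """Recover the intended title from a string containing literal
    backslash-x UTF-8 byte escapes, e.g. 'Gal\\xC3\\xA1pagos Islands'."""
    # Interpret the \xNN escapes as single bytes, then decode those
    # bytes as UTF-8 to get the real characters.
    return (raw.encode("ascii")
               .decode("unicode_escape")
               .encode("latin-1")
               .decode("utf-8"))

print(unescape_title("Gal\\xC3\\xA1pagos Islands"))
print(unescape_title("Karush\\xE2\\x80\\x93Kuhn\\xE2\\x80\\x93Tucker conditions"))
```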

I don't think we should have such pages created.  Hazard-SJ  ✈  02:32, 25 June 2013 (UTC)
Please see Wikipedia:Village pump (technical)/Archive 113#Are thousands of people a day not finding the articles they want?. --Redrose64 (talk) 07:44, 25 June 2013 (UTC)

Semi-automatic Commons file upload script/"bot"

I created a little PHP script/"bot" to upload pictures from my web site to Commons, just to avoid having to go through the file upload wizard and manually enter all the information that is already in our database anyway and should thus be transferred automatically. I only used it occasionally, and all it did was upload one picture at a time. Now it appears that the server's IP address got blocked (the response contains ":"Unknown error: \"globalblocking-ipblocked\""), so I guess somebody does not like what I'm doing. Thus:

  • I found no page on Commons similar to this one - is this the right page to discuss bot issues related to Commons?
  • Does this even qualify as a bot?
  • Does it have any chance of getting approved as a bot since it's only my personal tool and not supposed to be used on a larger scale?

Thanks! --Kabelleger (talk) 19:20, 25 June 2013 (UTC)

The issue is that the IP address you are uploading from is blocked. If you provide the IP address we can look into the reason for it being blocked. Werieth (talk) 19:25, 25 June 2013 (UTC)
Thanks, that would be 46.4.30.143 --Kabelleger (talk) 18:35, 27 June 2013 (UTC)
See Special:GlobalBlockList/46.4.30.143; it was blocked because "Open proxy: webhosting, abused by spambots". If you want to avoid this issue, request that your bot be given the IPBlockExempt user right. Werieth (talk) 19:05, 27 June 2013 (UTC)
Ah, now I get it, somebody blocked a whole Hetzner IP range and thereby also my machine "by accident". Thanks. --Kabelleger (talk) 18:41, 28 June 2013 (UTC)

Database problem - all 'bots which use certain information should be suspended temporarily

There appears to be a problem with the Wikipedia back-link information returned by "What links here". See Wikipedia:Administrators'_noticeboard/Incidents#Hazard-Bot_false_positives_flood. This caused Hazard-Bot to start flagging thousands of images as orphaned. This may be due to a corrupted database index. Please watch your 'bot behavior closely until this problem is cleared up. Any 'bots that rely on "what links here" data should be temporarily suspended. --John Nagle (talk) 20:26, 22 July 2013 (UTC)

As posted on AN/I: the response from dev/ops is "I don't know what's causing this, but I'd dearly love to find out". In the short-term, making a null edit to the page (not the file or file description page) as suggested, works. In the long term, Opsen are working hard on this problem. Okeyes (WMF) (talk) 22:47, 22 July 2013 (UTC)

Pywikipedia is migrating to git

(Posted only at WP:VPT; thought it would be worthwhile to repost here. -- John Broughton (♫♫) 03:18, 24 July 2013 (UTC) )

Hello. Sorry for writing in English, but it's very important for bot operators, so I hope someone translates this. Pywikipedia is migrating to Git, so after July 26 SVN checkouts won't be updated. If you're using Pywikipedia you have to switch to Git; otherwise you will be using an out-of-date framework and your bot might not work properly. There is a manual for doing that and a blog post explaining this change in non-technical language. If you have questions feel free to ask in mw:Manual talk:Pywikipediabot/Gerrit, on the mailing list, or in the IRC channel. Best, Amir (via Global message delivery). 13:06, 23 July 2013 (UTC)

Takeover of NoomBot

NoomBot was shut down by its creator (see User talk:NoomBot#Bot shutoff), and that user went inactive the next day and hasn't made a single edit since April 22, 2013. I thought I should report this and that maybe someone would step up and takeover. The operator states in the talk page post I linked that if anyone wants the source code to email him.--Fuhghettaboutit (talk) 12:40, 26 June 2013 (UTC)

I'll look at having AnomieBOT take over NoomBot 6. Anomie 15:28, 26 June 2013 (UTC) Never mind, I already took that over in August at Noom's request (AnomieBOT 67). Anomie 15:32, 26 June 2013 (UTC)
I can takeover this bot. I do a lot of PHP.—cyberpower ChatOnline 15:29, 26 June 2013 (UTC)
Noom is currently working on providing me his code with the warning that there are bugs present in it and that it needs some work. Once I have the code I can work on fixing the mentioned issues.—cyberpower ChatOnline 21:32, 26 June 2013 (UTC)
Wonderful. It's great when Wikipedia works like this. Thanks cyberpower.--Fuhghettaboutit (talk) 12:06, 27 June 2013 (UTC)
I'm always happy to help. :-)—cyberpower ChatOffline 12:37, 27 June 2013 (UTC)
Could the source code be made available under a Free license? --Ricordisamoa 13:51, 27 June 2013 (UTC)
Posting the source will be very low on my priority list. I still have other technical stuff to do such as set up X!'s tools on labs.—cyberpower ChatOffline 17:14, 27 June 2013 (UTC)
  • Takeover in progress. Noom has provided me with the source code. I hope to have it adapted soon.—cyberpower ChatOnline 21:38, 29 June 2013 (UTC)
    BRFA filed.—cyberpower ChatOnline 15:46, 24 July 2013 (UTC)

Please see this thread about a weird occurrence related to MadmanBot. De728631 (talk) 13:15, 7 August 2013 (UTC)

RotlinkBot approved?

I can't seem to find a record anywhere that User:RotlinkBot is an approved bot. It doesn't have the bot bit set. Am I just missing something? ElKevbo (talk) 18:00, 18 August 2013 (UTC)

No, it does not appear to be approved; I probably would have noticed a bot on this topic. I can't believe how many edits it has managed to make without anyone noticing. It appears the botop is the Archive.is owner, and their edits seem well-meant and at first glance correctly executed. Still, no BRFA per BOTPOL. —  HELLKNOWZ  ▎TALK 19:04, 18 August 2013 (UTC)
Blocked as an unapproved bot. — Earwig talk 19:19, 18 August 2013 (UTC)

Four bots, looking for a good home

I need someone to take over my bots for me. This has been a long time coming and I simply don't have the time to maintain them anymore. Furthermore my response time to issues/bugs has started to become unacceptable for a bot op. They're all written in PHP, have some quirks, and could use a bit of work, but they are pretty easy to run as long as you keep on top of things. So if anyone is interested, please drop me a line and I'll send you the latest source code, and details of how to set the bot up.

  • Chris G Bot 3 - archives/clerks WP:CHU, won't be needed for too much longer, as renames are going to be taken away from bureaucrats (when is that happening?)
  • One bot - archives WP:MFD and a few other things. Very easy and low maintenance.
  • GA bot - manages Good article nominations. There are a couple of outstanding bugs, and it needs a bit of love, but nothing too hard.
  • RFC bot - manages RFCs. Also low maintenance, however there are a couple of feature requests on my list.

--Chris 03:18, 22 August 2013 (UTC)

I'm assuming it's using your wikibot class. I've had prior experience with that framework. I'll be happy to take over those bots.—cyberpower ChatOffline 05:42, 22 August 2013 (UTC)

Bot owners needed to take over some tasks

Hi all. User:28bot is going to be offline for a while, so I thought it would be best to solicit other bot ops to take over the tasks it currently performs. Those are:

  1. Edit test cleanup (BRFAs #1 and #4)
  2. Creation of daily maintenance subcategories, e.g. Category:Wikipedia files missing permission as of 9 August 2013 (BRFA #2)
  3. Creation of daily current events page, e.g. Portal:Current events/2013 August 9 (BRFA #3)

Thanks, 28bytes (talk) 07:04, 11 August 2013 (UTC)

I've got task 3 since it was an original SoxBot task.—cyberpower ChatOnline 07:21, 11 August 2013 (UTC)
Since no one is objecting, I am proceeding to file a BRFA for task 3.—cyberpower ChatOnline 10:09, 24 August 2013 (UTC)
Hello 28bytes, task 2 seems easy enough to code; I'm willing to take that on. As for BRFAs #1 and #4, would you be willing to supply the code you currently use?  Hazard-SJ  ✈  05:34, 28 August 2013 (UTC)
Absolutely; it uses the pywikipedia framework. Email me if you're interested and I will email you back the source file. 28bytes (talk) 17:14, 29 August 2013 (UTC)

Pywikipedia bot can't edit?

Starting Aug 29, my pywikipedia-based bot can't edit anymore - it returns an error message saying "Token not found on wikipedia:en. You will not be able to edit any page". Anyone else seeing this? -- Rick Block (talk) 19:47, 31 August 2013 (UTC)

Are you running the latest version of Pywikipedia? (Here's a thread from a while ago I found in a google search as well, if it helps: [7]...use_api = True) Theopolisme (talk) 20:01, 31 August 2013 (UTC)
Yes - first thing I tried was svn update (and use_api is True). Are you saying you're using pywikipedia and it's working fine for you? -- Rick Block (talk) 20:08, 31 August 2013 (UTC)
I've been experiencing the exact same problem with my bot (same error message, same timeframe.) Clearly something's changed, but I have not had time to look into it. 28bytes (talk) 20:10, 31 August 2013 (UTC)
Have you guys tried re-logging in? Werieth (talk) 20:13, 31 August 2013 (UTC)
I have now - but still getting the same error. -- Rick Block (talk) 20:17, 31 August 2013 (UTC)
  • pywikipedia switched to Git over a month ago, you guys might want to update that :) Werieth (talk) 20:18, 31 August 2013 (UTC)
The Hasteurbot engine is working ok Special:Contributions/HasteurBot Hasteur (talk) 20:35, 31 August 2013 (UTC)

Updating to the latest git clone fixed it. Thanks! -- Rick Block (talk) 23:33, 31 August 2013 (UTC)

Tools cluster inaccessible

FYI: The Tools cluster is in a period of instability (http://ganglia.wmflabs.org/latest/?r=hour&cs=&ce=&m=load_one&s=by+name&c=tools&h=&host_regex=&max_graphs=0&tab=m&vn=&sh=1&z=small&hc=4). If your bot runs on the Tools cluster, be aware that the Labs administrators know about the problem and are in the process of cleaning it up. Hasteur (talk) 11:00, 10 September 2013 (UTC)

How far should a BotOperator bend on additional requests after a task has been approved

I am getting significant push-back from a single editor (who happens to be an admin) who wants to significantly change the way that approved tasks function. I fundamentally disagree with their complaint and want to know how far I am required to bend to accommodate individual editors' complaints about the bot's approved action. I am intentionally not making this about the individual editor or the specific complaint (but am willing to accept advice within the current context). I think I've already been very accommodating with respect to the approval and have taken on board reasonable requests, but I think that fundamentally changing the bot's activities would make it effectively toothless and a disservice to Wikipedia (why even bother nominating G13s if we're never making headway on the backlog?). Hasteur (talk) 19:13, 13 September 2013 (UTC)

I'll try to sum it up without going into any specifics of your case. Bots have to perform tasks that have consensus and consensus can change. BRFAs require consensus, either explicit discussion and links to policies/guidelines or silent for non-controversial or unopposed changes. At the time of approval, all BRFAs should (ideally) have such consensus to operate. As per BOTPOL, BAG have no authority over future botop's actions (except to revoke BRFAs at certain times). So it becomes botop's responsibility to adjust the bot to follow any changes in consensus. It is really up to you and community how you handle this. If you had a long discussion with many editors and strong support, then individual users can't really overthrow that without another similar discussion, or that's just disrupting the process. On the other hand, if hardly anyone commented and the consensus was more of a no opposition than support, then new concerns should carry weight. —  HELLKNOWZ  ▎TALK 19:28, 13 September 2013 (UTC)
If you disagree with the editor's requests to change the behavior of your approved bot, then my opinion would be that it is then that editor's option to start a wider discussion to try to form a consensus for the way they want the bot to behave. Without a wider consensus, you're under no obligation to change the behavior of an otherwise approved bot, unless BAG revokes the bot's approval. ‑Scottywong| comment _ 23:47, 13 September 2013 (UTC)
Snottywong's words are wise here. I am completely clueless about the details of the situation, but you will, from time to time, run into people unwilling to bend, who demand that the bot op do something or else blah blah blah. Sometimes the bot op is at fault; other times the user is being unreasonable. When you are at an impasse, an effective solution is to seek input from the community (the Village Pump is a good place), or if it concerns mostly the articles of a specific project, the WikiProject's talk page may also be a good place to hold a discussion. If consensus is on your side, you can carry on, and the other person has to accept that. If consensus is that the bot's behaviour needs adjustment, then the bot has to be modified accordingly, either by becoming more limited in scope to avoid the problem edits, or by having fancier and more refined logic. And a BRFA may be needed to put the new logic to test, depending on the scope of the modifications. Headbomb {talk / contribs / physics / books} 00:33, 14 September 2013 (UTC)
Agreed.—cyberpower ChatOnline 01:08, 14 September 2013 (UTC)

MiszaBot III

MiszaBot III (talk · contribs) has not edited for a week, and Misza13 (talk · contribs) has not edited here since May. Did something happen to the API/framework/etc on September 11 that would explain the failure? Can anyone suggest how to get it going again? -- John of Reading (talk) 16:00, 18 September 2013 (UTC)

Note that MiszaBot I (talk · contribs) and MiszaBot II (talk · contribs) - which are (I believe) exactly the same as III apart from the namespaces - have run as normal more recently than III. --Redrose64 (talk) 16:12, 18 September 2013 (UTC)
Who actually controls these bots, and from where is the code executed? Fiddle Faddle 11:44, 19 September 2013 (UTC)
Obviously Misza13 (talk · contribs) does, and they run on the toolserver.—cyberpower ChatOffline 13:06, 19 September 2013 (UTC)
It wasn't as obvious as you think, since Misza13 hasn't been seen since May or thereabouts. Bot owners do seem to pass them on. If it runs on the toolserver, is there anyone willing and able to look at it? Fiddle Faddle 21:53, 19 September 2013 (UTC)
I sent Misza an email. It's generally considered rude to peek in other people's home directories, and even besides that, Misza's directory doesn't allow other people to read it. I'm pretty sure he's running the standard archivebot.py Legoktm (talk) 22:04, 19 September 2013 (UTC)
I fail to see what that has to do with it. If it was passed on, the ownership information on the bot page will be updated accordingly.—cyberpower ChatOnline 23:26, 19 September 2013 (UTC)

I'll check the logs when I get back from work and have SSH access. On a different note, if anyone wants to take over this mess from me, they're more than welcome. The bot is mostly running correctly on pages where configuration is okay, but the amount of pages where it errors out due to misconfiguration, blacklisted links etc. is simply staggering. And I have neither time nor interest to clean up people's mess anymore. —Миша13 07:42, 20 September 2013 (UTC)

What's the programming language?—cyberpower ChatOffline 13:59, 20 September 2013 (UTC)
It just uses archivebot.py from pywikibot. Werieth (talk) 14:04, 20 September 2013 (UTC)
Not my forte. Sorry.—cyberpower ChatOnline 14:09, 20 September 2013 (UTC)
@Misza13: Are the error logs publicly visible? If so, some of us could fix up some of the errors due to misconfiguration. Myself and John of Reading (talk · contribs) have fixed several over the last year or so, but only where somebody has posted to User talk:Misza13 about a problem which they've noticed. --Redrose64 (talk) 16:34, 20 September 2013 (UTC)
@Misza13: if you're willing to provide the commands and settings you use, I might be able to take a look. I'm far more familiar with other languages in the C family than Python, but it shouldn't be that hard to make the minor changes needed. Werieth (talk) 22:41, 20 September 2013 (UTC)

Temporary shutdown of the federal government

Please remember to stop the weblink-checking bots for now! The Library of Congress is already down. [8] --Hedwig in Washington (TALK) 02:00, 2 October 2013 (UTC)

An RfC on reducing the API edit limit for logged out users

I thought it worth pointing out to the botops about this RfC.—cyberpower ChatOnline 23:52, 2 October 2013 (UTC)

Admin bot

Some input would be appreciated here about the possibility of a bot with admin rights. GiantSnowman 11:49, 11 October 2013 (UTC)

You may need to reset your bot's password

See m:October 2013 private data security issue. If your bot tries logging in through the API, it will fail unless you manually login and reset the password. Legoktm (talk) 06:05, 3 October 2013 (UTC)

Note that enwiki is not on the list of affected databases. --R'n'B (call me Russ) 10:07, 3 October 2013 (UTC)
Yes, there are bots that were affected by it. It disrupted Cyberbot I and caused it to malfunction.—cyberpower ChatOnline 14:20, 3 October 2013 (UTC)

MiszaBots down

MiszaBot I (talk · contribs) has not edited since 04:16, 2 October 2013; MiszaBot II (talk · contribs) not since 22:35, 2 October 2013; and MiszaBot III (talk · contribs) not since 00:18, 3 October 2013. Is this an effect of the API password issue described above? --Redrose64 (talk) 16:50, 4 October 2013 (UTC)

If it's using assert correctly, then likely yes.—cyberpower ChatLimited Access 17:26, 4 October 2013 (UTC)
I went to pl:Specjalna:E-mail/Misza13 and sent an email. --Redrose64 (talk) 17:48, 4 October 2013 (UTC)
I'm willing to set up a replacement until Misza13 is able to rectify this if necessary, since I have some experience using the script on Wikidata. He last edited 2 weeks ago, and his recent edit history suggests that it might be some time before a response, especially if the email above wasn't received/viewed.  Hazard SJ  03:21, 6 October 2013 (UTC)
Redrose64, both of his emails are listed on his userpage ;) I've set up Legobot to archive in ns1, 3 and 5 as a one time run to help the current backlog. Legoktm (talk) 08:16, 7 October 2013 (UTC)
@Legoktm: what does ns1, 3, and 5 mean? Will this take care of the backlog at the administrative noticeboards?--Bbb23 (talk) 23:29, 7 October 2013 (UTC)
Talk:, User talk:, and Wikipedia talk: namespaces. (See WP:NS for numerical codes). I'll set up a run for the Wikipedia (ns4) namespace right now. Legoktm (talk) 23:30, 7 October 2013 (UTC)
@Legoktm: Thanks, and I can see that at ANI, Legobot jumped in and archived. But then after that I also see someone later saying (again) they were going to switch to ClueBot because MiszaBot wasn't working. I don't want there to be problems with conflicting bots; can you check on it, please?--Bbb23 (talk) 01:03, 9 October 2013 (UTC)
Those MiszaBots have not edited for about 6 to 7 days, and I'm kind of worried about that. Jianhui67 talkcontribs 06:38, 9 October 2013 (UTC)
Hmm. WP:AN/EW hasn't been archived by a bot since MiszaBot II (talk · contribs)'s daily archive on Oct 1. Did the ns4 run happen, @Legoktm:? --Elvey (talk) 03:13, 10 October 2013 (UTC)
Evidently, we need a Miszabot clone that can run if the actual Miszabots ever stop archiving. People here may be interested in this BRFA. Σσς(Sigma) 02:55, 14 October 2013 (UTC)
When a new bot gets online, could it please look at my talk page? I have stuff from July and I don't want to mix it myself. -(tJosve05a (c) 21:39, 16 October 2013 (UTC)
@Josve05a: The bots have been skipping your page because it is relatively empty and your "MiszaBot/config" doesn't set "minthreadsleft" or "minthreadstoarchive". The documentation for these is at User:MiszaBot/Archive HowTo#Parameters explained. -- John of Reading (talk) 06:12, 17 October 2013 (UTC)
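For reference, a minimal "MiszaBot/config" block that sets both parameters might look like this (the archive target and values here are illustrative; see the HowTo page linked above for the full parameter list):

```
{{User:MiszaBot/config
|archive = User talk:Example/Archive %(counter)d
|algo = old(30d)
|counter = 1
|maxarchivesize = 150K
|minthreadsleft = 4
|minthreadstoarchive = 1
}}
```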
The bot is still not working, any update on this? Thanks, Funandtrvl (talk) 15:50, 19 October 2013 (UTC)
I saw MiszaBot archiving talk pages on Commons on 26 October 2013. See this. Jianhui67 talkcontribs 10:14, 27 October 2013 (UTC)
MiszaBot worked on Meta too. This too. --레비ReviDiscussSUL Info 08:54, 28 October 2013 (UTC)

ClueBot III

I switched from MiszaBot to ClueBot for my user talk archival when the former went down. Now it looks like User:ClueBot III is down as well -- it hasn't edited since 10/20 (PDT). Any news on that front? —Darkwind (talk) 18:57, 25 October 2013 (UTC)

Seems there was a squid error and it crashed, the restart script apparently doesn't work so well. I'll look into fixing it, just restarted the bot manually now. - Damian Zaremba (talkcontribs) 19:56, 28 October 2013 (UTC)

Should bots be "fixing" archived talkpage comments?

See this [9] ANI discussion re SporkBot "fixing" the use of now-deleted templates on archived comments from article talk pages. As I explain there, I see unquantifiable (but nonzero) risk, and zero value, in tampering with already-archived talkpage comments, such as here [10]. I'd appreciate others' thoughts on this. EEng (talk) 13:21, 27 October 2013 (UTC)

I think that fixing a bad transclusion of a template is a positive action. I have noticed a lot of missing tl's in archived pages myself and was always puzzled whether to fix them or not. -- Magioladitis (talk) 14:15, 27 October 2013 (UTC)
It's helpful in that it clears those pages out of lists of errors. -DJSasso (talk) 17:12, 28 October 2013 (UTC)
What lists of errors? Can you point me to an example of such a list? EEng (talk) 18:17, 28 October 2013 (UTC)
Wikipedia:Database_reports/Transclusions_of_deleted_templates is just one of many. Werieth (talk) 18:21, 28 October 2013 (UTC)
Good. Modify whatever generates that report to exclude archived material. Admittedly I'm not sure how that would be defined precisely, but that's not an excuse for modifying pages that explicitly say e.g. (from {{talkarchivenav}}) "This is an archive of past discussions. Do not edit the contents of this page." EEng (talk) 19:47, 28 October 2013 (UTC)
That would more than quadruple the processing needed for those reports, and it wouldn't work for everything (see Special:WantedTemplates for another example). Having a bot clean up (most often by substing the template) is just good housekeeping. Werieth (talk) 19:51, 28 October 2013 (UTC)
I really don't see the objection to the changes. It's not like the bot is changing the substance of editors' archived comments or anything significant like that. It's a technical edit that both clears up clutter elsewhere in the project, and makes the display of the talk archive page clearer and easier to read. These edits make it clear that a template was intentionally used and is now no-longer-valid rather than typo'd in the first place. The point of the 'no further edits' or 'do not edit this page' yadda yadda in an archive is to prevent further discussion from taking place there -- not to prevent future technical fixes. —Darkwind (talk) 20:08, 28 October 2013 (UTC)
No, it's to preserve the discussion faithfully. Continuing a closed discussion, or tinkering with a comment posted by someone who's no longer watching the archive page -- one is as bad as the other, and there's no way a bot can be sure it's not doing something the poster of the comment didn't intend; and as I keep saying, there's no point either. See my comment just below. EEng (talk)
The quadruple argument isn't convincing because there's no need to make more than one pass fixing invocations of the deleted template. Let's say we can agree there are certain types of pages that should be fixed and certain that shouldn't. When a template is deleted make a pass fixing only the invocations on the "should-fix pages" (at that horrible quadruple cost you mentioned). Do that just once and you're done. If, subsequent to this, someone for some reason (or mistakenly) adds a deleted template to an article or talkpage or whatever, it's up to them (or others watching the article or participating in the discussion) to notice. Meanwhile, on the archived pages that you didn't fix, the old invocations just sit there as they've always been -- what's the need to fix them anyway? There's no need to make additional searches for the same template. You're done. What would be wrong with that? EEng (talk) 23:34, 28 October 2013 (UTC)
P.S. I'm glad we're having this discussion -- I'm prepared to be convinced I'm wrong, but as you can see I'm very nervous with the idea of automated changes to pages with no watchers. It's only an accident that this archive was on my watchlist (because I'd manually archived something once).

You don't understand how the reports are created. Let's say we have template X, used on a total of, say, 10,000 pages. A deletion discussion has been closed as delete. We have two options: substitute and then delete, or change links and then delete. Otherwise you will end up with thousands of broken template transclusions. This causes several issues: it adds pointless entries to Special:WantedTemplates, and makes general housekeeping of template issues messier. Outright deletion of templates would modify the posts in the exact same way that you want to avoid. Let's say {{Keep}} was deleted/merged/moved/whatever to make it mean {{Delete}}; anywhere that template was used, the meaning of the original post would get changed. Housekeeping bots would then subst the old Keep template, making it an orphan, maintaining the meaning of the post while also allowing the template to be changed. Also, you were talking about filtering out archive pages from the report. How do you define an archive? I bet I can find 100 archive pages that fail to meet your definition of an archive but are still archives. These reports are often created by processing template links in the database; filtering out archives would require the bots to access the live wiki and download and process the text of each page. If template X is used on 10,000 pages, it might be on an archive on 2,000 of them. WantedTemplates cannot take that into consideration, and the bot would need to process all 10,000 pages each time it ran in order to find which pages are archives and which are not. That just ends up being a megalithic process that would collapse under its own weight in a short amount of time and fail. The best solution is for a bot to just clean up those templates, converting the transclusions into either substitutions or links. Werieth (talk) 23:54, 28 October 2013 (UTC)

You're right -- I don't understand how the reports are generated, and I appreciate your contributing that knowledge. Now, some points:
  • Outright deletion of templates would leave the source text of the post exactly as it was, and change the rendered text to contain a Template:blahblah redlink, which would signal anyone looking in the archive as to what's going on, without misleading them as to what the original poster had "coded" in his post, which is what absolutely should be preserved without tampering.
  • Archives could be defined by the presence of any of the various templates already in existence, which I suppose could be enhanced with some sort of "frozen -- bots don't tinker here!" sub-template.
  • Templates are not typically changed to mean something completely contradictory to what they used to mean, as in your example.
  • I don't see how considerations re WantedTemplates can matter much, since as of now it seems not to have been generated since last January.
EEng (talk) 00:21, 29 October 2013 (UTC)
Talk pages are often read by what's rendered, not what's coded. Removing a template can harm the reading of a conversation. The issue with template names is more common than you think; I've seen several dozen shuffles, and while the new meaning may not be the opposite, it's not the same meaning either. Yes, archives could be labeled that way, but you know what? A lot are not. WantedTemplates is just one example of a report that would get screwed up. There are others, and it's just bad housekeeping. You are making a mountain out of a molehill; housekeeping bots have been around since 2006 and have been editing archives the whole time. Werieth (talk) 00:58, 29 October 2013 (UTC)
OK, I relent. You've won this time. But someday I'll make you pay for this. You'll all pay!! [Laughs diabolically, rubs hands menacingly.] EEng (talk) 01:15, 29 October 2013 (UTC) P.S. for the humor-impaired: Just kidding!
Some apparent archives are intended to be edited - click any [edit] link under WP:FAC#Nominations and you'll find that you're editing a page named like "Wikipedia:Featured article candidates/(something)/archive1". --Redrose64 (talk) 08:39, 29 October 2013 (UTC)

There seems to be a problem with the above - it is adding Today's Featured Article to some IP talk pages instead of a warning. Regards Denisarona (talk) 15:54, 6 November 2013 (UTC)

Lets try and keep all the discussion on WP:AN. Legoktm (talk) 18:13, 6 November 2013 (UTC)
I left the message here because nobody, at the time, had responded to my original message at WP:AN. Denisarona (talk) 15:43, 7 November 2013 (UTC)

Publicizing e.g. {{subst:dated|clarify}}

Being lazy I always used to code {{clarify}}, letting some bot come in to add the date. But recently I discovered I could code {{subst:dated|clarify}}. It's a minor difference but it does eliminate the need for an extra history entry (and possible edit conflict) when the bot does the dating. The funny thing is that other editors can't learn from my example, because once the subst is done the code left in the source looks the same as if I had added the date by hand.

Not to take away work from your lovely bots, but wouldn't it be good to publicize this? It occurs to me that one way to do that would be in the bot's edit summary as it adds dates e.g. instead of just saying

Dating maintenance tags: {{Copypaste}}

say something like

Dating maintenance tags: {{Copypaste}} Next time use {{subst:dated|Copypaste}} to date template as you add it.

Just a thought. EEng (talk) 03:10, 29 October 2013 (UTC)

This is not a good idea. The trouble with {{subst:dated}} is that it doesn't pass any parameters through: it assumes that the only parameter used by the wrapped template is |date=, which is often not the case. To take your example, {{clarify}} recognises four parameters: |reason= |date= |pre-text= |post-text= - but {{subst:dated|clarify}} would only fill in |date=. If you were to use {{subst:dated|clarify|reason=does this description refer to the whole station, or just the ticket hall?}}, what you would get back is {{clarify|date=October 2013}} - the |reason= parameter has been lost. However, if you subst: {{clarify}} in the manner described by its documentation, i.e. {{subst:clarify}} - that is, without using dated| - it yields {{Clarify|reason=does this description refer to the whole station, or just the ticket hall?|date=October 2013}} and the |reason= parameter is preserved. --Redrose64 (talk) 08:33, 29 October 2013 (UTC)
Note that substing works for {{clarify}}, but other maintenance templates may not be set up for it. It's relatively simple to fix it (see Template:unsubst#Usage). I wonder if I could get AnomieBOT to do the conversion for templates listed at WP:AWB/DT. Would there be support for that? Anomie 11:19, 29 October 2013 (UTC)
Dead end caused by EEng not reading what people said carefully
I figured there might be some more-parameters problem but knew you guys would be on top of it. I don't use AWB (maybe I should) but I was just looking for something I could code manually. I've since thought of
{{clarify|date=~~~~~}} (five twiddles) which gives (for example) {{clarify|date=16:40, 29 October 2013 (UTC)}}
though I suppose the time of day would be objected to. Anyway, it seems like you wizards ought to be able to come up with something to avoid all this extra bot traffic. EEng (talk) 16:40, 29 October 2013 (UTC)
It will certainly be objected to, because it'll cause a redlinked category: in this case, the article will appear in Category:Wikipedia articles needing clarification from 16:40, 29 October 2013 (UTC) which is so specific that it is unlikely ever to contain more than one article. --Redrose64 (talk) 17:34, 29 October 2013 (UTC)
  • or you could use {{clarify|{{subst:DATE}}}} Werieth (talk) 17:57, 29 October 2013 (UTC)

Well, gee mister, why dintcha say so in the first place? That's the very thing needed! Works beautifully

{{clarify|{{subst:DATE}}|reason=confusing pronoun}}

becomes wikisource

{{clarify|date=October 2013|reason=confusing pronoun}}

which renders on page as

[clarification needed]

The need for all-caps DATE is unfortunate, as is the need to remember to subst, and not to code date={{subst:DATE}} (which yields date=date=October 2013) -- a natural mistake to make. Please, no lectures on car, cdr, nil and so on, but isn't there some way to make a template selfsubsting, so we could code something like

{{clarify|{{SDATE}}|reason=confusing pronoun}}

(where SDATE means self-substing DATE, or something).

Anyway, assuming no such improvements, what would y'all think of bot edit summaries like:

Dating maintenance tags: {{Copypaste}} Next time use {{Copypaste|{{subst:DATE}}}} to automatically add date.

Along another line of thought, isn't there some way to make a template add the date on its own, by default? EEng (talk) 00:07, 30 October 2013 (UTC)
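For readers following along, the expansion that {{subst:DATE}} produces (a date= parameter naming the current month and year) can be sketched in pure Python; the helper name here is invented for illustration and is not part of any bot:

```python
from datetime import date

def subst_date(today=None):
    """Illustrative stand-in for what {{subst:DATE}} expands to:
    a 'date=Month Year' parameter, e.g. 'date=October 2013'."""
    d = today or date.today()
    return d.strftime("date=%B %Y")

print(subst_date(date(2013, 10, 29)))  # date=October 2013
```

This also shows why hand-coding the parameter is error-prone: the expansion already includes the date= prefix, so writing date={{subst:DATE}} doubles it.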

We would need to introduce a new template like {{Orfud}} Werieth (talk) 00:10, 30 October 2013 (UTC)
OK, that's a bigger project -- I don't want to get off the original topic, which is telling people a simple way to add the date. What do you think of the augmented edit summary I suggested (in bold above)? EEng (talk) 02:43, 30 October 2013 (UTC)
Why is
{{clarify|{{subst:DATE}}|reason=confusing pronoun}}
preferable to
{{subst:clarify}}
which is nine characters shorter, has no capitals, and (apart from the text of |reason=) is what I mentioned at 08:33, 29 October 2013 --Redrose64 (talk) 08:44, 30 October 2013 (UTC)

To be blunt my cognitive faculties were neutralized by your run-on paragraph of examples. Why didn't you just say:

Unfortunately dated doesn't know how to handle additional parameters e.g. {{dated|clarify|reason=whatever}}. Believe it or not {{subst:clarify}} or {{subst:clarify|reason=whatever}} supplies the date automatically and handles parameters correctly.

Being by then in an impatient frame of mind, when I saw "AWB" in Anomie's post I figured, "Oh, the technogeeks are having a coding fest again" and kind of tuned out. (Please understand when I say this that I myself am part technogeek on my father's side.)

I am taking the liberty of collapsing this side discussion, which is my fault.

Anomie's idea sounds great, though I lack the knowledge to comment on potential technical problems, and I take it this is not the forum to gain agreement on this. I'd be happy to join the discussion there. BTW, I have the recollection that at least some of the usual templates (clarify, cn, ...) don't seem to understand reason= in the sense that they don't show the reason text when one hovers. May I suggest that all of these types of improvement-needed templates consistently take such a parameter and show it on hovering. Synonyms for reason (such as concern=, explanation= ...) might be good too. EEng (talk) 16:02, 30 October 2013 (UTC)

<bump> EEng (talk) 15:03, 4 November 2013 (UTC)
  BRFA filed Anomie 23:14, 16 November 2013 (UTC)

Help with PyWikiBot

I'm not sure if this is the right forum for this question, but I'm having trouble with PyWikiBot. When running any program from the Command Prompt, I get the message 'git' is not recognized as an internal or external command, operable program or batch file. Running login.py lets me log in (after displaying that message), but nothing else works.

Any help would be appreciated. My computer uses Windows 7. – Ypnypn (talk) 17:39, 19 November 2013 (UTC)

Two things: the git error message is not something to worry about. Can you post the results from version.py? Werieth (talk) 17:53, 19 November 2013 (UTC)
Pywikibot: pywikibot/__init__.py (r-1 (unknown), b3f5438, 2013/09/20, 21:0 OUTDATED)
Release version: 2.0b1
Python: 2.7.3 (default, Apr 10 2012, 23:24:47) [MSC v.1500 64 bit (AMD64)]
unicode test: ok — Ypnypn (talk) 18:44, 19 November 2013 (UTC)
From that I would assume you are using the newer core checkout. What happens when you run pwb.py pagegenerators.py -cat:200 Werieth (talk) 18:51, 19 November 2013 (UTC)
1: 200
2: Battle of Guandu
3: Battle of Boma
4: List of state leaders in 200 — Ypnypn (talk) 18:59, 19 November 2013 (UTC)

I think I figured out the problem: The current version of PWB uses the basic import pywikibot, but the instructions on MediaWiki said to use import wikipedia. Thanks for your help, Werieth! - Ypnypn (talk) 23:05, 19 November 2013 (UTC)

Talk page archive bots

People are encouraged to place {{Experimental archiving}} on talk pages if they used MiszaBot for archival. Wikipedia:Bots/Requests for approval/Lowercase sigmabot III 2 is growing stagnant, and such a task should be moving faster than it is. Σσς(Sigma) 23:31, 19 November 2013 (UTC)

DumbBOT is down

Wanted to let this board know that User:DumbBOT has not performed any edits since November 23. Since this bot creates and transcludes WP:RFD daily subpages, I would hope that this can be resolved ASAP. I went ahead and posted this issue on the bot's owner Tizio's talk page. Hopefully, this gets resolved soon. Steel1943 (talk) 08:11, 25 November 2013 (UTC)

I'm a volunteer for takeover if he no longer wants to run this bot.—cyberpower ChatOnline 18:31, 25 November 2013 (UTC)
Task seven is covered by lowercase sigmabot (talk · contribs). Σσς(Sigma) 08:27, 27 November 2013 (UTC)
If somebody takes over the RFD task, then it would be good if anything older than seven days would be placed in a new section called "Old discussions" (just like at WP:TFD). Armbrust The Homunculus 03:17, 1 December 2013 (UTC)
  Resolved
 – DumbBOT started running again on November 25. --Bamyers99 (talk) 17:38, 1 December 2013 (UTC)

Wikinews Importer Bot

As with all of Misza13's bots, User:Wikinews Importer Bot hasn't run since October 26th 2013. Is there a replacement in the works or another bot that could run this task? Nanonic (talk) 01:40, 9 December 2013 (UTC)

BAG Membership request

I have been nominated for BAG membership. Input is invited. The request can be found at Wikipedia:Bot Approvals Group/nominations/Cyberpower678 2.—cyberpower OnlineMerry Christmas 14:24, 22 December 2013 (UTC)

New "Draft" namespace

A new "Draft" namespace has been configured on enwiki for suitable AfDs and new articles (voluntary except for IPs). Ids: Draft - 118, Draft talk - 119. See Wikipedia:Drafts for more details. --Bamyers99 (talk) 19:31, 24 December 2013 (UTC)

Recent builds of pywikipedia have supported this change Hasteur (talk) 23:29, 24 December 2013 (UTC)
Cyberbot and Twinkle have already been updated since day 1.—cyberpower OnlineMerry Christmas 23:44, 24 December 2013 (UTC)

Gerrit Patch Uploader

Just for the record: Pywikibot owners and developers may contribute their patches to the pywikibot framework without having git installed on their local computer, by using the Gerrit Patch Uploader tool. Have fun!  @xqt 15:46, 30 December 2013 (UTC)

Orphan tags to be moved to the talk namespace

The proposal was closed as having consensus to move all orphan tags to the talk namespace, including with a bot. Any bots or scripts that currently add {{orphan}} to articles should be modified accordingly. Ramaksoud2000 (Talk to me) 20:51, 19 December 2013 (UTC)

Comment BRFA filled by Magioladitis. Armbrust The Homunculus 23:28, 21 December 2013 (UTC)
Comment It appears that the AWB and Twinkle developers are likely to remove orphan tagging from their tools:
GoingBatty (talk) 23:33, 21 December 2013 (UTC)

AWB is almost ready to deactivate orphan tagging for the English Wikipedia, and can also guarantee that AWB bots add orphan tags in the correct place on talk pages. Only some final minor changes remain. Moreover, AWB will discontinue automated orphan tagging/untagging in article space and won't auto-tag on talk pages. -- Magioladitis (talk) 23:46, 21 December 2013 (UTC)

Adminbots

Is there a list of all currently approved adminbots? WJBscribe (talk) 12:16, 6 January 2014 (UTC)

Bot for mass creation of articles/categories

Hi,
can someone recommend a bot (software) for mass creation of articles/categories? (It's not a task for the English Wikipedia, so I can't post to requests.) Thanks in advance. --XXN (talk) 22:32, 26 December 2013 (UTC)

Huh? Create articles? That's a technical impossibility unless you can plug a human brain into a computer.—cyberpower ChatOffline 04:04, 27 December 2013 (UTC)
It's not an impossibility - see the October 2002 listing at History of Wikipedia#Hardware and software for example. GoingBatty (talk) 04:14, 27 December 2013 (UTC)
I guess, but there's no underlying intelligence behind it. It would help to know what kind of articles are being written.—cyberpower ChatLimited Access 12:56, 27 December 2013 (UTC)
At sv.wikipedia.org a lot (!) of articles, disambiguation pages and categories have been created about different species and lakes using WP:AWB. -(tJosve05a (c) 13:54, 27 December 2013 (UTC)
Try asking Dr. Blofeld (talk · contribs). --Redrose64 (talk) 20:15, 27 December 2013 (UTC)
He has a bot for this type of tasks? XXN (talk) 13:41, 28 December 2013 (UTC)
Bot or not, just look at the number of articles created. There must be some sort of automation going on there. --Redrose64 (talk) 14:08, 28 December 2013 (UTC)
I really need to optimize that tool.—cyberpower ChatOnline 16:11, 28 December 2013 (UTC)
As XXN has already discovered, Wikipedia:CSVLoader may be helpful for mass creation of articles. GoingBatty (talk) 21:05, 28 December 2013 (UTC)
Plenty of articles here, including most populated places in the US, were started by bots, so there is long precedent for this, and it often happened here before the deletionism problems of our recent years. You are welcome to file a request at the bot requests page -- it has long been a place that EN Wikipedia runs for the whole movement. But you need a link to a discussion showing you've got consensus on the wiki where the bot would run. ϢereSpielChequers 06:36, 18 January 2014 (UTC)
Exactly. Spotted this issue the day before yesterday though. --Ankit Maity § (chatter) «Contribs» 16:17, 3 January 2014 (UTC)

I've brought this to the attention of DeltaQuad. MM (Report findings) (Past espionage) 02:30, 18 January 2014 (UTC)

  Issue is now resolved. DQ didn't say that he'd sorted it but his last edit was on the 19th to DQBot and it's now working properly. MM (Report findings) (Past espionage) 00:49, 21 January 2014 (UTC)

New extension: Flow

The new Flow extension is being deployed to enwiki today. It is being deployed to only two wikiproject pages as a test run to get real users trying out the new interface constructs so they can be tweaked or completely changed based on real world usage until we arrive at a discussion system that can serve the needs of wikipedians.

Because Flow is in such an early stage, with many things uncertain, the API modules it enables are a shim exposing the internals which is sufficient only for the existing ajax calls. These will change without notice, and I encourage you to not yet build out integrations with these APIs.

We have a regular integrated MediaWiki API in the works (T59659 and others) which bots will be able to integrate with; we expect to have this merged and deployed well before expanding from our initial test runs in the wikiproject space. Flow integrates with a number of MediaWiki constructs such as recent changes, watchlists, contributions, etc. Feel free to file bugs for anything those integrations might break that previously worked.

EBernhardson (WMF) (talk) 17:41, 3 February 2014 (UTC)

Flow can be enabled on any page, but in practice Flow is currently enabled on a named set of talk pages on a wiki. If Flow is enabled on a page (API needed? T62809) then it blocks most URL actions (?action=foo) on the page and implements its own; at the API level it implements its own actions, but it seems existing API actions succeed (T62808). -- S Page (WMF) (talk) 04:17, 4 February 2014 (UTC)
I created mw:Flow/Architecture/API with an overview of the API, please respond here or there with particular use cases for what you want out of the API. -- S Page (WMF) (talk) 05:52, 4 February 2014 (UTC)

Unicode

Using PyWikiBot, my program keeps on failing whenever a page contains non-ASCII characters. (Actually, it only fails when regex-searching the text or when outputting it to Command Prompt.) -- Ypnypn (talk) 19:58, 6 February 2014 (UTC)

That would be due to a problem in your Python programming. Don't use the print command (your terminal doesn't support the characters), and second, ensure that all your strings use the unicode notation u'string' and that you are not mixing them. Pywikibot has its own output method for addressing the unicode issue; I am unsure of what it is at the moment, but look it up. Werieth (talk) 20:03, 6 February 2014 (UTC)
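A minimal pure-Python sketch of the failure mode involved (nothing pywikibot-specific): a Windows console codepage such as cp437 cannot represent many characters, so emitting them raises UnicodeEncodeError unless you encode with a replacement policy. The function name is invented for illustration:

```python
def console_safe(text, encoding="cp437"):
    """Re-encode text for a limited console, replacing characters the
    target codepage cannot represent instead of raising
    UnicodeEncodeError (the crash described above)."""
    return text.encode(encoding, errors="replace").decode(encoding)

title = u"Lech Wa\u0142\u0119sa"  # contains characters cp437 lacks
print(console_safe(title))        # Lech Wa??sa
```

Pywikibot's own output wrapper does something similar internally, which is why Werieth recommends it over a bare print.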
Okay, thanks. -- Ypnypn (talk) 23:23, 11 February 2014 (UTC)
  Resolved

Hi. Could someone please look at this? Things are quite dead, and this needs to be done soon. A number of articles are partially updated, and so is the main {{Infobox dam}}. It's a mess. There is consensus, and no objections. Can we get this going right away please? Rehman 10:21, 9 February 2014 (UTC)

You may need to {{ping}} some of the BAG members listed at WP:BAG. - Ypnypn (talk) 23:24, 11 February 2014 (UTC)
Rehman Mellow out... I've gotten home and am about to start my trial run. Hasteur (talk) 23:56, 11 February 2014 (UTC)
Cool. :] Rehman 00:03, 12 February 2014 (UTC)

Pywikibot and a blocked bot

When running a custom script with pywikibot, what's the best way to catch that the bot has been blocked if you want it to do certain things only in that case (e.g. save a log to your hard drive instead of posting it to a user subpage on-wiki)? This is in regard to the compat release, as I had trouble getting the core release to work on my computer. I tried using a try/except block to catch pywikibot.UserBlocked (based on the help text at the top of wikipedia.py), but it didn't catch it when I blocked my bot, and I'm not sure what else to try. Any help here would be appreciated. (If anyone cares, this bot is for an external wiki, but this page seems to be the place where I'm most likely to get a reasonably quick reply. Hope you don't mind the quick question; I don't plan to make asking questions here a habit.) jcgoble3 (talk) 06:53, 27 February 2014 (UTC)

Try using Site().isBlocked() from wikipedia.py.  Hazard SJ  05:36, 28 February 2014 (UTC)
OK, I'll look into that. I've also identified a possible mistake in my code that may have caused it to not catch UserBlocked properly, so fixing that might be the easier solution. I'll try both and see what works better. Thanks. jcgoble3 (talk) 01:31, 3 March 2014 (UTC)
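The control flow jcgoble3 describes can be sketched generically. UserBlockedError below is a stand-in for the framework's real exception (pywikibot.UserBlocked in compat), and both save callables are hypothetical placeholders, so this runs without pywikibot installed:

```python
class UserBlockedError(Exception):
    """Stand-in for the framework's blocked-bot exception."""

def post_log(save_to_wiki, save_to_disk, text):
    """Try to save the log on-wiki; if the bot turns out to be
    blocked, fall back to saving it locally instead."""
    try:
        save_to_wiki(text)
        return "wiki"
    except UserBlockedError:
        save_to_disk(text)
        return "disk"

# Simulated blocked bot: the on-wiki save always raises.
def blocked_save(text):
    raise UserBlockedError("bot is blocked")

local_log = []
print(post_log(blocked_save, local_log.append, "run log"))  # disk
print(local_log)                                            # ['run log']
```

The same structure works whether the block is detected by catching the exception on save or by checking Site().isBlocked() up front, as Hazard-SJ suggests.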


Template talk archiving

There isn't a problem for MiszaBot I or other bots to do archiving on template talk pages, is there? Harold O'Brian (talk) 03:05, 4 March 2014 (UTC)

The MiszaBots (I, II, III) have been inactive since October 2013, and Lowercase sigmabot III has taken over all tasks of the three MiszaBots. What you need is consensus to run an archive bot on the page. (See Help:Archiving a talk page#Automated archival.) — Revicomplaint? 03:51, 4 March 2014 (UTC)
So you mean a lack of consensus makes it technically malfunction on template talk pages? Harold O'Brian (talk) 06:51, 4 March 2014 (UTC)
No consensus = No archiving. That's all. Nothing technical, AFAIK.. Revicomplaint? 07:24, 4 March 2014 (UTC)
Lowercase sigmabot III (talk · contribs) works quite happily to archive Template talk: pages. If a given page isn't being archived, it's usually because the {{User:MiszaBot/config}} is misformatted - or is simply absent. See the various threads at User talk:MiszaBot that have been answered by myself and John of Reading (talk · contribs). --Redrose64 (talk) 07:46, 4 March 2014 (UTC)
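For anyone landing here with the same problem, a typical well-formed setup looks like this (the page name and values are examples only; see Help:Archiving a talk page for what each parameter does):

```wikitext
{{User:MiszaBot/config
| algo                = old(30d)
| archive             = Template talk:Example/Archive %(counter)d
| counter             = 1
| maxarchivesize      = 150K
| minthreadsleft      = 4
| minthreadstoarchive = 1
}}
```

A misspelled parameter name or a missing %(counter)d in the archive target are the most common reasons the bot silently skips a page.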

Labs Migration

Cyberbot I and II are being migrated to new data centers in labs, and will be down for a bit. Bot ops using tool labs are encouraged to migrate at their earliest convenience.—cyberpower ChatAbsent 15:29, 5 March 2014 (UTC)

All done. Went a lot quicker than expected.—cyberpower ChatTemporarily Online 17:07, 5 March 2014 (UTC)

Bot for files description

Is there a bot which finds non-free files (logos; covers for movies and singles) with a specified license via template, but without an information/source or non-free use rationale, and automatically adds the missing rationale template? //XXN (talk) 16:57, 6 March 2014 (UTC)

Sounds like my requests at Wikipedia:Media copyright questions; User talk:Toshio Yamaguchi and Wikipedia:Bot requests. --Redrose64 (talk) 17:20, 6 March 2014 (UTC)
  Resolved

This notice is to inform the people that monitor this page that a topic has been brought up on Wikipedia:Administrators' noticeboard#User talk:Hasteur#HasteurBot being naughty? that you may be interested in. — {{U|Technical 13}} (tec) 18:36, 18 March 2014 (UTC)


How can I create a new entry with Wikipedia?

How can I create a new entry with Wikipedia? — Preceding unsigned comment added by Garrynewyork (talkcontribs) 13:35, 4 April 2014 (UTC)

You have already asked this question at WP:EAR. Please do not post the same thing to multiple forums. SpinningSpark 14:34, 4 April 2014 (UTC)

New operator needed for VeblenBot and PeerReviewBot

CBM implemented and ran VeblenBot and PeerReviewBot, but is retiring from Wikipedia. I am in occasional email contact with CBM who wrote:

"It would be a good idea to find a different person to run the bot jobs. With the WMF Tools setup, I can actually just hand them the entire bot as a turnkey, they would not need to re-implement it. If you can find someone, please ask them to email me (and you email me) and I will be able to communicate with them that way."

If you are interested in taking over these bots please reply here. They are usually pretty trouble free. My email and CBM's email are both enabled.

I do the monthly PR bot maintenance (making the files and categories) and that includes adding the new PR category each month on the VeblenBot account - I would be glad to keep doing that (and give details on email).

Thanks, Ruhrfisch ><>°° 13:50, 5 April 2014 (UTC)

Logged out bot?

contributions JV Smithy (talk) 06:10, 12 April 2014 (UTC)

Somebody running Cyberbot I (talk · contribs) which is owned by Cyberpower678 (talk · contribs) --Bamyers99 (talk) 14:45, 12 April 2014 (UTC)

VoxelBot Vandalism DefCon score

Can anything be done to restore the automatic updating of the Defcon score and level, which VoxelBot until recently was doing every half-hour? Lots of counter-vandalism workers will be looking at out-of-date information on the many displays based on {{Vandalism information}} that are fed by this process: Noyster (talk), 16:11, 3 April 2014 (UTC)

Some action was taken by the bot developers over at the bot's talk page, but it broke immediately afterwards. Since then I've been doing my best to update the template manually. --k6ka (talk | contribs) 14:57, 30 April 2014 (UTC)

Deprecate pywikibot/compat

A request for comment on MediaWiki.org is seeking feedback on the possible deprecation of pywikibot/compat. If you're running that framework, you may be interested in the discussion. --Ricordisamoa 23:52, 3 May 2014 (UTC)

Logged-out archiving

Ever since late yesterday Wikipedia time, User:10.68.16.31 has been archiving discussions on assorted pages and labeling them as (BOT). I'm not sure, but I don't think bots should be doing work logged out, so I brought it here. I've got no idea which bot it is. Supernerd11 :D Firemind ^_^ Pokedex 03:27, 12 April 2014 (UTC) Sorry about being such a noob!

Somebody running ClueBot III (talk · contribs) which is owned by Cobi (talk · contribs) --Bamyers99 (talk) 14:42, 12 April 2014 (UTC)
This is happening again. --Redrose64 (talk) 19:09, 6 May 2014 (UTC)

As of June 1st 2014, Cobi's bot is still causing issues. He is not an active operator.

Inactive operator for bot (need relief operator)

User:Citation bot (edit | talk | history | links | watch | logs) is a much-loved fixer of citation templates, but its creator/operator is busy IRL, and has been so for quite a while. His appeal for assistance or relief in maintenance and operation of that bot has gone largely unanswered.

After the switch to Lua for the CS1 templates, a substantial rewrite was done, but some nasty bugs are not yet dealt with. The op has barely been onwiki and hasn't touched the code in eight weeks. Even code reviews would be a big help, and if someone can find corrections, so much the better. I've tried, but my brain won't wrap itself around the language used (PHP).

The code is open, available at http://code.google.com/p/citation-bot/ for anyone considering helping out. LeadSongDog come howl! 15:31, 7 May 2014 (UTC)