Wikipedia:Bots/Requests for approval

New to bots on Wikipedia? Read these primers!

If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming it may be a good idea to ask someone else to run a bot for you, rather than running your own.

 Instructions for bot operators

Current requests for approval

Operator: Urban Versis 32 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 18:17, Saturday, July 15, 2023 (UTC)

Automatic, Supervised, or Manual: supervised

Programming language(s): Python (Pywikibot)

Source code available: Main repository for UrbanBot's code; Source code file for task

Function overview: UrbanBot's task is to mass-add short descriptions to pages that don't have one.

Links to relevant discussions (where appropriate): Original discussion at village pump; Wikidata discussion for bot task

Edit period(s): Runs whenever the bot operator runs the script

Estimated number of pages affected: Any page lacking a short description may be edited by UrbanBot. This is not to say it will try to add short descriptions to every page lacking one.

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: 1. The bot operator will first enter a category name from the English Wikipedia. The bot will work through the pages in this category that lack a short description, and the same operator-specified short description will be added to each of them.

2. The bot operator will enter the short description to be added to the pages in the Wikipedia category.

3. The code will check the short description entered to ensure that it does not exceed the character limit.

4. The bot will follow through these steps for each page:

4a. The bot will check if the page already has a short description or a template-applied short description.

4b. If the Wikipedia page does not already have a short description, the bot will write the short description specified by the bot operator in step 2 into the item.

4c. The bot will loop through to the next page in the category and run all steps in step 4 again until every page has been scanned.

5. The bot will output statistics on the number of pages scanned, number edited, etc.
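As a rough illustration of steps 3 and 4a, the wikitext-level checks might look like the following sketch. The regex, the helper names, and the 40-character limit (the length WP:SDSHORT recommends) are assumptions rather than the bot's actual code, and detecting template-applied short descriptions (e.g. from infoboxes) would additionally require the parsed page or an API query, not just raw wikitext.

```python
import re

# Matches an explicit {{Short description|...}} template in raw wikitext.
SD_TEMPLATE = re.compile(r"\{\{\s*[Ss]hort description\s*\|")

# Assumed limit; WP:SDSHORT recommends about 40 characters.
MAX_SD_LENGTH = 40

def has_short_description(wikitext):
    """Step 4a (partial): is there already an explicit short description?"""
    return SD_TEMPLATE.search(wikitext) is not None

def validate_short_description(sd):
    """Step 3: reject empty or over-long descriptions before editing."""
    return 0 < len(sd) <= MAX_SD_LENGTH
```

In the Pywikibot version, these checks would run inside a loop over `pagegenerators.CategorizedPageGenerator` for the operator-entered category (step 4c).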

Note: The bot was originally submitted at Wikidata and was designed to edit Wikidata descriptions per this village pump discussion. However, after reviewing the WikiProject Short Descriptions page and receiving feedback and information at Wikidata, it became evident that for the intended task the bot should be based at Wikipedia and edit Wikipedia short descriptions rather than Wikidata descriptions.

Note 2: The bot is currently being run through limited testing of its code.


Can you give some examples where it's helpful for all pages in the category to have the same short description? In such cases, it would probably be more useful to apply the shortdesc via a template instead. – SD0001 (talk) 06:55, 16 July 2023 (UTC)
One such category would be Category:Linux distributions. I ran UrbanBot through this category to test the code when UrbanBot was still editing Wikidata. UrbanBot applied the following Wikidata description to articles in the category that did not have a short description or Wikidata description: "Linux distribution". This sort of thing works well when all pages in a category share the same main property, such as all being a Linux distribution. However, other categories wouldn't be as good for this, such as Category:Alumni of Lancaster University, where the only thing the pages have in common is that their subjects are all alumni of Lancaster University. This isn't what these people are known for, though, as evidenced by the variety of existing short descriptions in this category, such as "Danish sociologist" and "Irish politician". Urban Versis 32KB(talk / contribs) 16:40, 16 July 2023 (UTC)
How will the bot or bot operator ensure that the category assignment is correct? Honor (brand) is in Category:Linux distributions, but it does not appear to be a Linux distribution. Also TurnKey Linux Virtual Appliance Library, which is in that category but does not appear to fit the proposed SD. – Jonesey95 (talk) 03:22, 17 July 2023 (UTC)
That's why the bot is classified as supervised; I will look at the edit history of the bot and make sure each article is a Linux distro, in this case. Also, if a page that's not a Linux distribution is in the category for Linux distributions, then there's no reason for it to be in that category IMO. Urban Versis 32KB(talk / contribs) 15:02, 17 July 2023 (UTC)

In step 1 or step 4a, how does the bot determine if there is a short description that has been assigned by a template? Will the bot apply manual short descriptions to override template-based SDs? If so, why, or under what conditions? – Jonesey95 (talk) 03:25, 17 July 2023 (UTC)

Good point. I have fixed this so that the bot will also check whether there is a template-applied short description on the page, and if so, it will not override the SD as it would have before. Urban Versis 32KB(talk / contribs) 15:22, 17 July 2023 (UTC)

According to Wikipedia:WikiProject Short descriptions § State of the project there are over 1 million pages that are lacking shortdescs. Are you seriously saying (based on the "supervised" nature of this task) that you and your bot are going to add 1 million shortdescs? Primefac (talk) 16:09, 6 August 2023 (UTC)

No. If you're referring to the fact that I put "any page lacking a short description" as the estimated number of pages affected, I was trying to explain that any page lacking an SD could potentially be modified by UrbanBot. I wasn't saying that UrbanBot would try to add an SD to every single page lacking one. Urban Versis 32KB(talk / contribs) 20:57, 6 August 2023 (UTC)
A slightly more well-defined scope would be appreciated. Feel free to amend the main proposal directly. Primefac (talk) 07:42, 8 August 2023 (UTC)
It seems to me that the intended scope is "pages lacking short descriptions in manually selected categories". Difficult to put a number on that. casualdejekyll 02:38, 17 September 2023 (UTC)
...I had a brainfart and for some reason thought I was replying to a week-old comment and not a month-old comment. Oops. casualdejekyll 02:38, 17 September 2023 (UTC)
@Casualdejekyll Again, there is no defined number of pages affected by UrbanBot. Any of the million or so pages could possibly be affected by UrbanBot, but not all million pages will be affected. I am just trying to help reduce the number of pages lacking a short description. (Also, I know that this page hasn't gotten any attention for the past month; I'm still waiting for approval.) Urban Versis 32KB(talk / contribs) 15:43, 17 September 2023 (UTC)

Bots in a trial period

Operator: Mr. Stradivarius (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 13:39, Wednesday, September 27, 2023 (UTC)

Function overview: Automatically populate Module:Disambiguation/templates with a list of disambiguation templates and their redirects.

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python (Pywikibot framework)

Source code available:

Links to relevant discussions (where appropriate): Module talk:Disambiguation#Bot for updating template list

Edit period(s): Daily

Estimated number of pages affected: 1

Namespace(s): The Module namespace

Exclusion compliant (Yes/No): Yes

Adminbot (Yes/No): Yes

Function details: The bot iterates through all templates in Category:Disambiguation message boxes. Non-templates are ignored, as are templates in the bot's exclusion list (currently Template:Dmbox is ignored). The bot creates a list of all of these templates, and all of their redirects, formats it as a Lua table, and saves the result at Module:Disambiguation/templates. Saving is skipped if the module's content would not change. I have saved sample output from the bot here. Module:Disambiguation/templates is fully protected as it is used in Module:Disambiguation, which is currently transcluded on roughly 15.7 million pages, so the bot needs permission to edit protected pages.
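The generate-and-compare step described above could be sketched as follows. The exact Lua layout of Module:Disambiguation/templates and the function names here are assumptions for illustration, not the bot's actual code.

```python
def format_lua_table(template_names):
    """Render a sorted list of template names as a Lua module that
    returns a set-like table, e.g. ["Name"] = true."""
    lines = ["return {"]
    for name in sorted(template_names):
        escaped = name.replace('"', '\\"')
        lines.append(f'\t["{escaped}"] = true,')
    lines.append("}")
    return "\n".join(lines) + "\n"

def needs_save(current_text, new_text):
    """Skip the edit entirely when the rendered module is unchanged,
    so the daily run usually makes no edit at all."""
    return current_text != new_text
```

Sorting the names makes the output deterministic, which is what makes the skip-if-unchanged comparison reliable across runs.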


This should ideally be fixed in MediaWiki itself (phab:T71441). But something is better than nothing. Please notify WP:AN as well per WP:ADMINBOT.   Approved for trial (1 week). Please provide a link to the relevant contributions and/or diffs when the trial is complete. — SD0001 (talk) 18:38, 27 September 2023 (UTC)
WP:AN notified here. — Mr. Stradivarius ♪ talk ♪ 00:38, 28 September 2023 (UTC)

This seems like serious overkill. New templates will only be added very occasionally. Wouldn't it be better to have a bot periodically edit a separate list of all these templates, and if and when there are changes, let an admin update the module page manually (while checking that no one has added or removed a template by mistake or maliciously at the same time, something a bot won't do)? Granting admin status for a bot that will only really need to make a change once every month or so is not a good idea IMO (benefit-risk balance-wise). Fram (talk) 08:44, 28 September 2023 (UTC)

@Fram: The likely outcome of making this a manual process is that the module will be rarely updated, if ever. This means it would be inaccurate for long periods of time after templates are added or removed. On the other hand, if templates are added or removed from the category by mistake or maliciously, then presumably someone will notice this and revert the change. If the module is updated automatically, then in this situation it would be inaccurate for only a short amount of time (maybe a day or two, or no days at all if the change is reverted quickly enough). Using a bot to update the module seems like a better choice from this perspective. — Mr. Stradivarius ♪ talk ♪ 09:56, 28 September 2023 (UTC)
Why would it be swiftly noticed if someone removes a template from the list by mistake, but not if a template should be on it but isn't? And if the bot writes to a separate page and some admins put this on their watchlist, it is easy to see when a change is expected (page appears on watchlist) and can then be executed with care. Admin bots should be very rare (e.g. the proxy blocking one), and having one for this rarely necessary task (it will run once a day, but how often will it actually need to edit?) doesn't seem sensible. What is the effect of a disambig template not being in that list anyway? When a page is added to that list (manually or by bot), what is the effect vs. if it isn't on that list? Fram (talk) 10:06, 28 September 2023 (UTC)

Operator: Aidan9382 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 22:02, Monday, September 4, 2023 (UTC)

Function overview: Automatically move subpages left behind (orphaned) after moves of a parent page

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: The exact task script is not yet made, but will be derived off of this existing task

Links to relevant discussions (where appropriate):

Edit period(s): Twice per day or so

Estimated number of pages affected: 0-4 pages a day

Namespace(s): Just Talk:

Exclusion compliant (Yes/No): Yes, including on subpages intended to be moved (none will be moved if any have exclusion)

Function details: The bot would watch Special:Log/move for page moves, and if it finds a page move which has led to the orphaning of subpages, it will keep watch on it. After a waiting period (7 days or so) to avoid participating in a move war or revert, if the subpages are still orphaned, all of them can be moved without issue, and nothing else has happened that would make the move non-trivial, the bot will automatically move the subpages to under the new title, as well as adjust any archiving-related templates on the parent page ({{User:HBC Archive Indexerbot/OptIn}}, {{User:MiszaBot/config}}, {{User:ClueBot III/ArchiveThis}}).

This task is kind of like an expanded scope of my currently approved task, which does basically the same thing, but only for pages using {{User:MiszaBot/config}}. I've been running a userspace report to track moves which caused orphaned subpages here, which gives an idea about how often this happens. Aidan9382 (talk) 22:02, 4 September 2023 (UTC)


  Approved for trial (50 edits or 14 days, whichever happens first). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 10:21, 12 September 2023 (UTC)

@Primefac: Quick question: Should I count each page fixed as an "edit", or count each subpage move as 1 edit and the base-page edit to fix the template as 1 edit? (I assume the latter.) I intend to run it on the already existing list of pages, so I fully expect to hit the edit count before the day count (once the script is made). Aidan9382 (talk) 11:05, 12 September 2023 (UTC)
Each move. Primefac (talk) 11:20, 12 September 2023 (UTC)

Operator: HarshaMadhyastha (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 16:18, Tuesday, June 6, 2023 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: Plan to open-source, but not yet ready for release

Function overview: For every broken external reference in any English Wikipedia article, the bot will check if the page previously available at that link still exists on the web at an alternate URL. If successful, the bot will patch the reference to point to the new URL.

Links to relevant discussions (where appropriate):

Edit period(s): Manually start a new run once every few months

Estimated number of pages affected: All articles linked from

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: 1. The bot will iterate over every article linked from and scrape all external links in those articles.

2. For each link tagged "permanent dead link", the bot will attempt to find the new URL of the same page that previously existed at the now broken link. More details regarding the techniques used are at

3. If the new URL for the linked page is found, the bot will replace the "permanent dead" link with the new URL. The new URL identified by the bot is expected to be wrong about 10% of the time, per the statistics from. So, as suggested in the discussion at, for every link that it replaces, the bot will leave a "verification needed" tag.
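A minimal sketch of step 3, operating on raw wikitext. The template names ({{Permanent dead link}}, {{Verify source}}) and the replacement strategy are assumptions about the output format, not the bot's confirmed behaviour, and a real bot would edit parsed citation templates rather than doing string replacement.

```python
def replace_dead_link(wikitext, old_url, new_url):
    """Swap the broken URL for the recovered one and flag the
    citation for human review (per the ~10% expected error rate)."""
    if old_url not in wikitext:
        return wikitext
    updated = wikitext.replace(old_url, new_url)
    # Drop the now-inaccurate dead-link tag and request verification.
    updated = updated.replace("{{Permanent dead link}}", "{{Verify source}}")
    return updated
```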


  • 10% is a very high error rate, and that category is pretty big—a lot of those tags won’t get looked at for a while, and citations with incorrect URLs are extremely damaging. I think this needs to be semi-automated. Really cool idea by the way. Snowmanonahoe (talk · contribs · typos) 02:06, 9 June 2023 (UTC)
    Completely agree that having citations point to incorrect URLs is very bad, and we very much welcome any alternate proposals for how our bot should edit pages to include the new URLs it finds.
    Having us manually vet every new URL that the bot finds is not going to scale. So, here's an alternative that we are considering: instead of *replacing* a permanently dead link with the new URL we find, what if we *augmented* the citation to include a link to the new URL? This is similar to how the InternetArchiveBot works, which adds a link to an archived copy for any particular link that it finds to be dead, while still leaving the original link in place. Thoughts, or proposals for alternatives?
    The error rate of 10% is also the reason why we plan to focus specifically on permanently dead links, because for all of these links currently there is no way for a user to access the linked content: the original link does not work, and there is no archived copy. So, even though 10% of the new links will be wrong, fixing 90% of permanently dead citations is perhaps better than leaving 100% of them as broken? HarshaMadhyastha (talk) 17:16, 9 June 2023 (UTC)
    The augmenting thing is not really the same, because archives are 100% deterministically proven to be the correct URL. We have both URLs not because of a chance the archive url is wrong, but because it's helpful for readers to see what the original location of the website is, unless the website has been usurped (or in some cases, the site is still up, and an archive is just there).
    Even though it would fix 90% of permanently dead citations, that's great and all, but correct citations are the standard, not the goal. 10% of 178,345 is 17,835 citations that will point to the wrong webpage. If it points to the wrong live web page, which as far as I understand always happens, the result is a citation that doesn't really support what it's next to, confusing the reader and making the statement or possibly the whole article seem like nonsense, which is much worse than simply not having a citation, or having an inaccessible one.
    My idea is to do something like IABot's OAuth functionality, where you can have it run on certain pages. Even better would be if you could incorporate this stuff into IABot's existing tool, but I understand if that isn't feasible. Snowmanonahoe (talk · contribs · typos) 17:35, 9 June 2023 (UTC)
    "My idea is to do something like IABot's OAuth functionality, where you can have it run on certain pages."
    I'm not very familiar with this aspect of IABot. Can you please elaborate? Perhaps what you are proposing is that we design our bot such that a human editor of any page can specifically request for our bot to run on that page and see if it finds any replacements for the permanently dead links on the page, and the editor can then manually verify our edits?
    Also, to clarify my proposal regarding augmenting links, I am envisioning that the format we would use for the new link would make it explicit that it is a prediction for the new location of the linked page, so that the user is aware that this alternate link could potentially be incorrect. But, since there is no convention for such links currently on Wikipedia, I realize this would be a big change. HarshaMadhyastha (talk) 19:13, 9 June 2023 (UTC)
    [1] Snowmanonahoe (talk · contribs · typos) 19:17, 9 June 2023 (UTC)
    Thank you for the pointer. We'll take a look. HarshaMadhyastha (talk) 19:44, 9 June 2023 (UTC)
    Having reviewed how IABot's OAuth functionality works, we can certainly implement FABLEBot to work in a similar manner. But, before I recruit a student to work on this implementation, how can we get confirmation that a bot in this form is indeed what's desired and is likely to be approved for operation? Should I modify the bot's function details above? Or, should I submit a new request for approval? HarshaMadhyastha (talk) 21:57, 19 June 2023 (UTC)
  • GreenC, you're my go-to person for archive-related stuff: does this seem like a reasonable alternate option to supplement the existing archive bots (who are the primary adders of the "permanently dead" tags)? Primefac (talk) 09:16, 14 June 2023 (UTC)
    Hi Primefac, thanks for checking. I am familiar with the University of Michigan group. They have done some remarkable work in this area. We have corresponded in the past. I trust them to do good work. I agree a 10% error rate is too high for a fully automatic bot. However, it won't be "10% of 178,345 is 17,835 citations" because many of them won't have a new target, in fact most won't.
    In addition to the ideas above, I might suggest doing a "dry run" on about 3,000 pages and, instead of saving the page, logging the results to see how many pages would have been modified and extrapolate what the absolute number of bad links would be. From that it might be possible to determine how reasonable it would be to manually check every new archive. Processes can be created that make manual checking easier, such as loading 50 pages into tabs, closing each tab that is right, using "bookmark all tabs" for those that are wrong, then saving the bookmarks to a file which can be used to create a DB of inaccurate links which the bot can access.
    Another idea is to save the results to the talk page; IABot used to do this when it first started.
    As for an OAuth application, this page is a starting point. GreenC 13:30, 14 June 2023 (UTC)
    @GreenC, thank you for the vote of confidence.
    Sometime last year, we did an extensive analysis of a dataset of a few hundred thousand links which have no archived copies. We found that we were able to find the new URL for roughly 10% of these links.
    So, if we consider that we found roughly 300K links tagged as permanently dead on enwiki last year, we'd expect that our bot would rewrite around 30,000 of these links.
    When we first began thinking about this bot last year, we proposed starting off by posting the proposed URL replacements in talk pages. But, beyond doing this on a few hundred pages, the feedback was against us doing this. HarshaMadhyastha (talk) 19:50, 19 June 2023 (UTC)
    It should be possible to design a tool or page that makes manual checking of links fast and easy. It doesn't edit Wikipedia; it only displays URLs and queries users on whether those URLs are good or not. Maybe it displays 5 URLs at a time, with a radio button next to each for Keep or Delete and a "Save" button at the bottom, at which point it loads 5 more URLs for checking. Once the data is saved, a separate bot can update Wikipedia in batches. My bot can do this: I recommend using my bot for this part because moving URLs is a lot more complex than it seems; there is a lot to it: archive URLs, templates like webarchive, bare URLs, named references which contain URLs, etc. I have developed the codebase for this from years of experience with this kind of work. All that would be required is three data points: the pagename, oldurl and newurl. User:HarshaMadhyastha, do you have anyone with, for example, Python, JS, or PHP skills who could build a tool like this on Toolforge? I can provide links on where to get started. -- GreenC 21:34, 20 June 2023 (UTC)
    This sounds like a great plan, @GreenC. The data that you list -- page name, old URL, and new URL -- matches what we had on the page that we compiled last year for assessing FABLE's accuracy. So, we definitely are able to produce the identified URL replacements in this form.
    Let me talk to my current students to see if any of them is up for implementing the tool that you describe. Otherwise, I'm sure I'll be able to recruit a new student for this. So, if you can please provide pointers to the relevant documentation, that would be very helpful for the student to know how to get started. Thanks! HarshaMadhyastha (talk) 01:50, 21 June 2023 (UTC)
    Hi, sorry for the late reply! The first step is to create a Toolforge account [2]. If you are using Python, I found this tutorial to be a reliable and easy guide to creating a new tool: [3]. You won't need OAuth, so step 4 can be skipped. There are some other tutorials for other languages in the right column. GreenC 20:29, 3 July 2023 (UTC)
    Thank you for the pointers, @GreenC! One of my students has started working on setting things up on Toolforge. I have asked him to follow up with you here if/when he has any questions. HarshaMadhyastha (talk) 22:15, 7 July 2023 (UTC)
    Hi @GreenC! I'm the student assisting Harsha on this. One question I have relates to how we'll end up storing the aliases we've found. On Toolforge, is there some MySQL database we can use? Anishnya (talk) 03:20, 13 July 2023 (UTC)
    Hi Anishnya - When you sign up for a Toolforge account that should be included: -- GreenC 13:48, 13 July 2023 (UTC)

  Approved for trial. Please provide a link to the relevant contributions and/or diffs when the trial is complete. Per the discussion with GreenC above, I would like to see a "dry run" with a log (which can be placed in a subpage depending on size) so that a more accurate assessment of the error rate can be determined for this task. Primefac (talk) 16:11, 6 August 2023 (UTC)

User:Primefac I don't think this is going to be a bot, rather a Tool that users interact with that saves data to a database. This data will then be used by GreenC bot to update Wikipedia, which my bot already has approval for. This BOTREQ as such might be redundant? We might need to create a separate project page somewhere to coordinate. -- GreenC 16:42, 6 August 2023 (UTC)
@GreenC's assessment is right. We are no longer developing a bot that will be editing pages. As per the discussion above, we are currently developing a dashboard which will enable users to provide feedback on our proposed URL replacements. We will log the ones deemed to be correct, and GreenC will then use his bot to make those edits. HarshaMadhyastha (talk) 21:51, 9 August 2023 (UTC)
@HarshaMadhyastha: May I suggest we create a project page for discussion and coordination? One idea is WP:Link rot/FABLE, or in user space like User:HarshaMadhyastha/FABLE. I like the former as it may eventually become a documentation page linked from WP:Link rot. -- GreenC 00:56, 10 August 2023 (UTC)
That sounds good to me. Thank you, @GreenC. Once we have completed a first version of our dashboard (which should be soon), I'll create the page at WP:Link rot/FABLE, post an update there, and tag you on it. HarshaMadhyastha (talk) 02:10, 16 August 2023 (UTC)

Operator: Capsulecap (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 00:14, Wednesday, June 14, 2023 (UTC)

Function overview: This task checks the Top 25 Report page frequently to see whether the current report has been updated. If it has, the bot will go through all pages in the new report and add or update the Template:Top 25 Report template on their talk pages.

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: No, but if necessary I can upload it

Links to relevant discussions (where appropriate): Wikipedia:Bot requests#Top 25 report

Edit period(s): Daily

Estimated number of pages affected: 25 pages/week

Namespace(s): Talk

Exclusion compliant (Yes/No): No

Function details: This task first checks the page Wikipedia:Top 25 Report to see if the transcluded link was modified. (This should mean that the report was updated.) If it has, then it uses the first revision of the transcluded page, which is always a basic list, to get a list of article talk pages to modify. It then goes through each talk page, updating the Template:Top 25 Report template if it exists and adding it if not. As for exclusion compliance, I have not added that feature in yet.
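The change-detection step could be sketched as follows, assuming the report page transcludes its current edition as {{Wikipedia:Top 25 Report/...}}. The regex and function names are illustrative assumptions, not the bot's actual code.

```python
import re

# Assumed form of the transclusion on Wikipedia:Top 25 Report.
TRANSCLUSION = re.compile(r"\{\{(Wikipedia:Top 25 Report/[^}|]+)\}\}")

def current_report(wikitext):
    """Extract the transcluded report subpage title, if any."""
    m = TRANSCLUSION.search(wikitext)
    return m.group(1).strip() if m else None

def report_changed(wikitext, last_seen):
    """Compare against the title recorded on the previous daily run."""
    return current_report(wikitext) != last_seen
```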


The Top 25 report is updated weekly. Why does this task need to run twice a day? Primefac (talk) 09:08, 14 June 2023 (UTC)

I wanted to ensure that the template is added quickly. I've changed it to daily, and if the interval should be longer, you can tell me. Capsulecap (talkcontribs) 14:24, 14 June 2023 (UTC)
Additionally, some reports (including the one for last week) are finished late, and do not get added until later on. I wanted to ensure that the pages on the report get the template on their talk page. If the next report is done on time, then the maintainers of the report will replace the transclusion of the late report with the new one less than a week after the old report replaced the one before it. I agree that twice a day was a bit too excessive. Daily should be fine. Capsulecap (talkcontribs) 14:28, 14 June 2023 (UTC)
Capsulecap is right about this, and the task does not need to run twice a day.--BabbaQ (talk) 15:58, 14 June 2023 (UTC)
  • @Capsulecap: Hi. What would happen if the same article comes up in the Top 25 report again, say with a gap of four months? —usernamekiran (talk) 17:21, 21 June 2023 (UTC)
    If that happens, then there will be no difference from a page being featured twice with more than a four-month gap. There is nothing that says to do anything different for pages on T25 which are featured multiple times in a small timespan, and pages like Talk:ChatGPT feature multiple such examples. Capsulecap (talkcontribs) 23:56, 21 June 2023 (UTC)

  Approved for trial (1 day). Please provide a link to the relevant contributions and/or diffs when the trial is complete. I'm trying to wrap my head around what this bot is supposed to do exactly, so I'm going to approve it for a one-time run of 1 day. This should give me (and perhaps others) a better idea of what this is about. Headbomb {t · c · p · b} 17:34, 2 July 2023 (UTC)

@Headbomb: I did a trial run, but the bot made test edits with numerous errors. I have fixed the code causing these issues, and will (with permission) restart the trial when the next report comes in. Capsulecap (talkcontribs) 19:51, 3 July 2023 (UTC)
@Capsulecap: can you link to the results nonetheless? Headbomb {t · c · p · b} 21:46, 3 July 2023 (UTC)
See edits 4 through 29. Note that the newest three edits were a test run for a fix to something which happened in Talk:Elemental (2023 film), and that many incorrect edits were caused by other editors modifying talk pages to add the template before the test run was done. Although the bot will not add redundant templates assuming that nobody adds the Top 25 placement before it, I am considering adding redundancy protection. One problem — the one on the page about the Titan submarine incident — was one I didn't think of: the talk page was moved with the main page, causing the Top 25 Report template to be placed on a redirect instead of the actual talk page. This is a problem I am working on fixing, as I have noticed that "current events" pages that show up on the report frequently get moved. The bot also ended up creating the page "Talk:Errible things in Russia, the North Atlantic and HBO have the most attention this week.", but I fixed the source issue and tagged the page for CSD. A few of the edits are fine, and most would be fine if there was redundancy protection or if the Top 25 templates didn't already have the week in there. One question, though — since the bot will run daily, and people wouldn't need to modify Top 25 templates anymore — should I implement redundancy protection? Capsulecap (talkcontribs) 02:46, 4 July 2023 (UTC)
"Should I implement redundancy protection" I would say that's a good idea, regardless of how often it comes into play. Headbomb {t · c · p · b} 02:52, 4 July 2023 (UTC)
I just finished implementing the redundancy protection along with the redirect traversal stuff. The bot should work just fine now. Do I have to redo the trial? Capsulecap (talkcontribs) 04:35, 4 July 2023 (UTC)
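For illustration, the redirect traversal mentioned above might, at the wikitext level, look something like this sketch; a production bot would normally rely on the API's built-in redirect resolution instead, and the names here are assumptions.

```python
import re

# A moved talk page leaves behind wikitext of the form
# "#REDIRECT [[New title]]"; resolve it before editing.
REDIRECT = re.compile(r"^#REDIRECT\s*\[\[([^\]#|]+)", re.IGNORECASE)

def redirect_target(wikitext):
    """Return the redirect target title, or None for a normal page."""
    m = REDIRECT.match(wikitext.lstrip())
    return m.group(1).strip() if m else None
```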
  Trial complete. See 21 most recent contributions. Out of the 25 pages in the June 25th to July 1st edition, 21 pages were correctly edited, two pages (Talk:Money in the Bank (2023) and Talk:Titan submersible implosion) were not edited because of unexpected and likely erroneous formatting in the report's first revision (a space was in place of the usual tab after those two pages' titles), and two pages were not edited as they already had this week in their templates. For context on those two pages which didn't get the template by accident, the first revision of the report is always an imported set of tab-delimited data — in this case, spaces were in place of tabs for the names of those two articles. The bot created two new talk pages by accident, which I quickly tagged for CSD. Capsulecap (talkcontribs) 05:48, 7 July 2023 (UTC)
Update: I've come up with a solution to this problem and will be implementing and testing it soon. This is the last issue which I will have to fix. Capsulecap (talkcontribs) 16:26, 7 July 2023 (UTC)Reply[reply]

  Approved for extended trial (25 edits or 7 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. One week's worth, or 25 edits, whichever you need. Headbomb {t · c · p · b} 17:16, 7 July 2023 (UTC)Reply[reply]

  Trial complete. See the 25 most recent contributions. This time, I verified on a script with editing commented out that all edits the bot would make would be correct. They were all good edits, so I ran the full script. All 25 pages on the report had the template added or changed on their talk pages. Capsulecap (talkcontribs) 01:57, 15 July 2023 (UTC)Reply[reply]
Most seemed fine, but there was this that stood out.
Headbomb {t · c · p · b} 21:59, 20 July 2023 (UTC)Reply[reply]
I noticed that and didn't pay much attention to it as it was merely cosmetic. Since that was considered problematic, I'll get to fixing that and keeping the collapse as the last edit. Capsulecap (talkcontribs) 14:57, 21 July 2023 (UTC)Reply[reply]
For testing you can revert to a prior state and unleash the bot on it. Headbomb {t · c · p · b} 16:17, 21 July 2023 (UTC)Reply[reply]
  Trial complete. See 22 most recent edits. Also see this test edit which the bot made in user talk space showing a similar condition to the page Talk:Deaths in 2023. If you would like, I can manually revert the edit on Talk:Deaths in 2023 which added the newest date and run the bot again to show you. Capsulecap (talkcontribs) 19:58, 21 July 2023 (UTC)Reply[reply]
Well... the collapsed stuff is handled correctly, but now it's inconsistent the other way around. It should list the ranks when they're there, or omit them when they're not.
Or, probably a better idea, update old listings to list the ranks, e.g. [4]. You might need some discussion before though. Headbomb {t · c · p · b} 20:18, 21 July 2023 (UTC)Reply[reply]
I think it's a good idea to retroactively add the rankings to the templates, but I'm not sure of where to obtain consensus for that, and it would either require a bot task or lots of manual work. The other way you listed is probably easier, but causes inconsistency between pages. Something else I thought of is a Lua module that automatically grabs the placements, but I'm not sure if such a thing is supported. Capsulecap (talkcontribs) 20:30, 21 July 2023 (UTC)Reply[reply]
What if it deleted what was there first, then re-added the template with all dates and ranks? In the same edit that is. Headbomb {t · c · p · b} 20:46, 21 July 2023 (UTC)Reply[reply]
It could work, but I think I would have to submit a separate bot task for that. A separate (and much simpler) approach would be to add a "ranks" parameter that does nothing to the bot category. If set to yes, then the bot will add ranks when it updates the report. Otherwise or if unset, the bot will only add the date. This maintains consistency within talk pages, but not between talk pages; the latter would require consensus strongly towards either using ranks or not. Capsulecap (talkcontribs) 21:03, 21 July 2023 (UTC)Reply[reply]
  Approved for extended trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Indeed, consistency within talk pages is usually a lesser threshold to clear. I'm giving you trial for that (make sure to include a mix of both types of edits), but if you want to have that (should we always rank things) discussion first, you can also wait for consensus to emerge before trialing. Headbomb {t · c · p · b} 21:11, 21 July 2023 (UTC)Reply[reply]

Operator: Sohom Datta (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 02:43, Tuesday, May 30, 2023 (UTC)

Automatic, Supervised, or Manual: supervised

Programming language(s): NodeJS + mwn

Source code available: TBD (will publish in a dedicated subpage/on github)

Function overview: Adding Navboxes to pages corresponding to Indian villages

Links to relevant discussions (where appropriate): Expected to be uncontroversial

Edit period(s): one time run

Estimated number of pages affected: ~3,300+

Exclusion compliant (Yes/No): No (Will/Can respect any variation of {{nobots|deny=AWB}} if required)

Already has a bot flag (Yes/No): No

Function details:

- Finding all instances of articles inside of Category:Villages in India by district.

- Filtering articles that do not have a Navbox corresponding to their district. (The heuristic I used to arrive at the ~3,300 figure was checking whether a template with the name of the district exists on the page.)

- Adding appropriate navbox related to the district to which the village belongs.
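The filtering step above can be sketched as follows. This is an illustrative Python rendering of the described heuristic (the bot itself is written in NodeJS with mwn), and the function and inputs are hypothetical:

```python
# Illustrative sketch of the filtering heuristic: an article is a candidate
# if no transcluded template's name mentions its district. Names here are
# examples only, not the bot's actual code.

def needs_navbox(template_names, district):
    """Return True if no transcluded template mentions the district."""
    district_lower = district.lower()
    return not any(district_lower in name.lower() for name in template_names)

# An article in Udupi district with no matching navbox is a candidate:
needs_navbox(["Infobox settlement", "Use Indian English"], "Udupi district")
# An article already transcluding the district navbox is skipped:
needs_navbox(["Settlements in Udupi district"], "Udupi district")
```

A substring check like this can produce false negatives (e.g. a navbox named differently from the district), which is consistent with the operator describing it as a heuristic for estimating scope.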


Could you please give an example or two of an edit the bot would be performing? (please do not ping on reply) Primefac (talk) 08:36, 7 June 2023 (UTC)Reply[reply]

Something similar to 1156232688 and 1156210338. Sohom (talk) 18:36, 7 June 2023 (UTC)Reply[reply]
@Sohom Datta: Could you please help me understand 1156232688? It appears Yermal is not included in {{Settlements in Udupi district}}. Thanks! GoingBatty (talk) 22:45, 12 June 2023 (UTC)Reply[reply]
@GoingBatty Yermal does show up in Category:Villages in Udupi district and benefits from being linked to a bunch of other articles via the Navbox. That being said, I did not notice that it was not linked in the Navbox; maybe we can expand the scope to add the article to the navbox as well? Sohom (talk) 00:42, 13 June 2023 (UTC)Reply[reply]
@Sohom Datta: If it is appropriate to add the article to the navbox, then it is appropriate to add the navbox to the article, per WP:BIDIRECTIONAL. GoingBatty (talk) 01:05, 13 June 2023 (UTC)Reply[reply]

  Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 09:37, 14 June 2023 (UTC)Reply[reply]

Noting here, I'll be traveling till the end of June, will run the trial once I'm back. Sohom (talk) 02:57, 25 June 2023 (UTC)Reply[reply]
@Sohom Datta any update? — Qwerfjkltalk 17:48, 26 September 2023 (UTC)Reply[reply]
This appears to have fallen off my radar, will run the trial in a bit Sohom (talk) 14:07, 27 September 2023 (UTC)Reply[reply]

Operator: Philroc (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 11:03, Thursday, May 4, 2023 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: [5]

Function overview: Update various Billboard chart articles to reflect current number-one songs and albums.

Links to relevant discussions (where appropriate):

Edit period(s): Once per week

Estimated number of pages affected: 38

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: The bot will extract the title and artist of each chart's current number-one song/album from the official Billboard website, combine them into a wiki-friendly format and insert the final product into the "current number-one" statement found in the chart's corresponding article.
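The insertion step can be sketched as a string substitution. This assumes the chart article contains a sentence of the form "the current number-one song … is …"; the scraping of the Billboard site is omitted and all names and patterns here are hypothetical:

```python
import re

def format_number_one(title, artist):
    """Combine scraped chart data into a wiki-friendly string."""
    return f'"{title}" by {artist}'

def update_statement(wikitext, new_entry):
    """Replace the current number-one statement, keeping surrounding text.

    The pattern is illustrative; the real article wording will differ.
    A function replacement is used so quotes in new_entry need no escaping.
    """
    return re.sub(
        r"(current number-one (?:song|album)[^.]* is )[^.]*",
        lambda m: m.group(1) + new_entry,
        wikitext,
        count=1,
    )
```

For example, `update_statement('The current number-one song on the chart is "Old Song" by Someone.', format_number_one("Flowers", "Miley Cyrus"))` swaps in the new entry while leaving the rest of the sentence untouched.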


  Approved for trial (21 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. In other words, three full updates. Primefac (talk) 08:43, 7 June 2023 (UTC)Reply[reply]

Operator: Harej (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 22:38, Sunday, May 7, 2023 (UTC)

Function overview: Generates reports and alert lists for source usage. Initially for the Vaccine safety project but with plans to support future WikiProjects.

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: Under development on GitHub

Links to relevant discussions (where appropriate): Wikipedia talk:Vaccine safety#Ongoing overhaul of Wikipedia:Vaccine safety/Sources (note that this bot only edits in pages specifically relevant to the report and not really being edited by other people)

Edit period(s): Daily

Estimated number of pages affected: about 2 project pages per subscribed WikiProject

Namespace(s): Project

Exclusion compliant (Yes/No): not applicable (bot only edits its own pages)

Function details:

  • Scans perennial sources tables such as Wikipedia:Vaccine safety/Perennial sources and uses of external links in articles, according to a pre-defined set of pages.
  • Prepares reports of frequent usage of unrecognized domains in articles, as well as usages of "flagged" domains in articles. "Flagged" means the source is known to be of poor or mixed reliability. There will be other reports in the future. Example: Wikipedia:Vaccine safety/Reports
  • Prepares alerts based on these reports. Alerts are summaries of new changes to the report, like a notification. Example: Wikipedia:Vaccine safety/Alerts
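The alert step, summarising what changed between two report runs, can be sketched as a comparison of two snapshots. The data model (a mapping from domain to use count) and the wording are assumptions for illustration, not the bot's actual format:

```python
# Illustrative sketch of the alert step: compare two snapshots of a report
# and summarise what changed since the previous run.

def build_alerts(previous_report, current_report):
    """Return human-readable lines describing changes between two runs."""
    alerts = []
    for domain, count in sorted(current_report.items()):
        old = previous_report.get(domain)
        if old is None:
            alerts.append(f"New domain: {domain} ({count} uses)")
        elif count != old:
            alerts.append(f"{domain}: {old} -> {count} uses")
    return alerts
```

Domains that disappeared from the report could be handled symmetrically; the sketch only covers additions and count changes.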

Harej (talk) 22:38, 7 May 2023 (UTC)Reply[reply]


{{BAG assistance needed}} — Preceding unsigned comment added by Harej (talkcontribs) 00:10, 14 May 2023 (UTC)Reply[reply]

  • non-bag comment: the task seems to be non-disruptive and helpful. I don't see any issues with giving it a trial, especially given Harej's credibility. —usernamekiran (talk) 22:51, 27 May 2023 (UTC)Reply[reply]

  Approved for trial (50 edits or 28 days, whichever happens first). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 08:39, 7 June 2023 (UTC)Reply[reply]

Primefac, I am just now seeing this, can I restart the clock on the trial? Harej (talk) 17:48, 17 July 2023 (UTC)Reply[reply]
It hasn't started yet if you haven't started yet. Primefac (talk) 08:31, 18 July 2023 (UTC)Reply[reply]

Operator: Hawkeye7 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 01:57, Wednesday, March 22, 2023 (UTC)

Function overview: Mark unassessed stub articles as stubs

Automatic, Supervised, or Manual: Automatic

Programming language(s): C#

Source code available: Not yet

Links to relevant discussions (where appropriate): Wikipedia:Bot requests/Archive 84#Stub assessments with ORES

Edit period(s): daily

Estimated number of pages affected: < 100 per day

Namespace(s): Talk

Exclusion compliant (Yes/No): Yes

Function details: Go through Category:Unassessed articles (only deals with articles already tagged as belonging to a project). If an unassessed article is rated as a stub by ORES, tag the article as a stub. Example
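The classification check can be sketched as follows. The response shape follows the documented ORES v3 scores API (wiki → "scores" → revision ID → model → "score" → "prediction"); treat the exact shape as an assumption to verify against the live service, and note the revision-fetching and talk-page tagging are omitted:

```python
# Sketch of the stub check described above, against an ORES-style response.

def predicted_class(ores_response, wiki, revid, model="articlequality"):
    """Extract the predicted quality class from an ORES-style response."""
    return ores_response[wiki]["scores"][str(revid)][model]["score"]["prediction"]

def should_tag_as_stub(ores_response, wiki, revid):
    """Act only on a positive Stub prediction; anything else is skipped,
    matching the proposal's claim that false negatives go untouched."""
    return predicted_class(ores_response, wiki, revid) == "Stub"

sample = {"enwiki": {"scores": {"123": {"articlequality": {
    "score": {"prediction": "Stub", "probability": {"Stub": 0.92}}}}}}}
```

With this sample, `should_tag_as_stub(sample, "enwiki", 123)` is true, so the bot would proceed to tag the article.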


  •   Note: This bot appears to have edited since this BRFA was filed. Bots may not edit outside their own or their operator's userspace unless approved or approved for trial. AnomieBOT 00:10, 28 March 2023 (UTC)Reply[reply]
    ^. Also, may potentially be a CONTEXTBOT; see Wikipedia:Stub: There is no set size at which an article stops being a stub. EpicPupper (talk) 23:04, 30 March 2023 (UTC)Reply[reply]
    The Bot run only affects unassessed articles rated as stubs by mw:ORES. The ORES ratings for stubs are very reliable (some false negatives – which wouldn't be touched under this proposal – but no false positives). Hawkeye7 (discuss) 00:03, 31 March 2023 (UTC)Reply[reply]
  •   Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Sounds reasonable as ORES is usually good for assessing stub articles as such. – SD0001 (talk) 11:41, 1 April 2023 (UTC)Reply[reply]
     Y Bot run with 50 edits. No problems reported. Diffs: [6]. Hawkeye7 (discuss) 00:42, 18 April 2023 (UTC)Reply[reply]
  • Comment: Some behavior I found interesting is that the bot is reverting start-class classifications already assigned by a human editor, and overriding those with stub-class. [7] and [8] EggRoll97 (talk) 03:28, 18 May 2023 (UTC)Reply[reply]
    This should not be happening. Frostly (talk) 03:58, 18 May 2023 (UTC)Reply[reply]
    The question is: what should be happening? The articles were flagged because some of the projects were not assessed. Should the Bot (1) assess the unassessed ones as stubs and ignore the assessed ones, or (2) align the unassessed ones with the ones that are assessed? Hawkeye7 (discuss) 04:21, 18 May 2023 (UTC)Reply[reply]
    Per recent consensus assessments should be for an entire article, not per WikiProject. The bot should amend the template to use the article wide code. If several projects have different assessments for an article it should leave it alone. Frostly (talk) 05:03, 18 May 2023 (UTC)Reply[reply]
    @Hawkeye7: Courtesy ping, I've manually fixed up the edits where the bot replaced an assessment by a human editor. 6 of the 52 total edits needed fixing. EggRoll97 (talk) 07:16, 18 May 2023 (UTC)Reply[reply]
  A user has requested the attention of a member of the Bot Approvals Group. Once assistance has been rendered, please deactivate this tag by replacing it with {{t|BAG assistance needed}}. This has been waiting for over 2 months since the end of the trial, and over 4 months since the creation of the request. Given the concerns expressed that the bot operator has since fixed, an extended trial may be a good idea here. EggRoll97 (talk) 05:19, 8 August 2023 (UTC)Reply[reply]

Operator: EpicPupper (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 02:55, Thursday, March 2, 2023 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s):

Source code available:

Function overview: Replace AMP links in citations

Links to relevant discussions (where appropriate): BOTREQ, Village Pump

Edit period(s): Weekly

Estimated number of pages affected: Unknown, estimated to be in the range of hundreds of thousands

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: Using the AmputatorBot API, replaces AMP links with canonical equivalents. This task runs on all pages with citation templates which have URL parameters (e.g. {{cite news}}, {{cite web}}, etc).
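The task delegates canonical-URL resolution to the AmputatorBot API, which looks up the page's actual canonical link. Purely for illustration, a naive local heuristic for common AMP URL shapes might look like the following; this is not what the bot does, and the patterns handled are assumptions:

```python
from urllib.parse import urlsplit, urlunsplit

def naive_deamp(url):
    """Very rough local heuristic for common AMP URL shapes.

    Strips an "amp." subdomain and "/amp" path segments. Unlike the
    AmputatorBot API, this cannot know the true canonical URL and will
    mangle sites that don't follow these conventions.
    """
    scheme, netloc, path, query, frag = urlsplit(url)
    if netloc.startswith("amp."):
        netloc = netloc[len("amp."):]
    path = path.replace("/amp/", "/").removesuffix("/amp")
    return urlunsplit((scheme, netloc, path, query, frag))
```

The gap between such a heuristic and a real canonical lookup is exactly why deferring to the API (and keeping the bot exclusion-compliant) is the safer design for hundreds of thousands of pages.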


  Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 10:27, 8 March 2023 (UTC)Reply[reply]

Just noting that I'm working on this but it may take some time. EpicPupper (talk) 23:01, 30 March 2023 (UTC)Reply[reply]
Been a bit busy IRL, but will get to this soon. Frostly (talk) 20:33, 25 June 2023 (UTC)Reply[reply]

Bots that have completed the trial period

Operator: Primefac (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 12:48, Thursday, May 11, 2023 (UTC)

Function overview: Convert template use following update

Automatic, Supervised, or Manual: Automatic

Programming language(s): AWB

Source code available: WP:AWB

Links to relevant discussions (where appropriate): Wikipedia talk:WikiProject Templates § Request for a template

Edit period(s): OTR

Estimated number of pages affected: 783

Namespace(s):

Exclusion compliant (Yes/No): Yes

Function details: {{Wikisource author}} recently was updated to allow for a |lang= parameter to link directly to non-English versions of wikisource for an author. A similar template, {{Wikisourcelang}}, links to a generic search on said language wiki for said author. This task will change {{Wikisourcelang|<lang>|otherstuff}} into a {{Wikisource author|lang=<lang>|otherstuff}} call.
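The substitution itself runs in AWB; a Python regex sketch of the same transformation, for illustration only, might look like this. The template and parameter names come from the description above, and the pattern only handles the simple case where the language is the first positional parameter:

```python
import re

def convert(wikitext):
    """Rewrite {{Wikisourcelang|<lang>|otherstuff}} as
    {{Wikisource author|lang=<lang>|otherstuff}}."""
    return re.sub(
        r"\{\{\s*[Ww]ikisourcelang\s*\|\s*([^|}]+?)\s*\|",
        r"{{Wikisource author|lang=\1|",
        wikitext,
    )
```

Any remaining parameters after the language are carried over unchanged, matching the "otherstuff" in the task description.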


  • {{BAG assistance needed}} valid request not attended by any BAG members for almost two months. —usernamekiran (talk) 23:02, 29 June 2023 (UTC)Reply[reply]
  •   Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete.
This seems pretty straightforward. Let's go to trial.
Headbomb {t · c · p · b} 17:38, 2 July 2023 (UTC)Reply[reply]
  Trial complete. Edits. As a note, I did not run genfixes, in order to make the proposed change more obvious, but if this task does proceed I will be running genfixes alongside it.
Piotrus, I think this request is a little more convoluted than initially requested. Languages such as de do not use an "author" prefix (see e.g. Adolph Friedrich Johann Riedel and his corresponding page on de Wikisource), but I can't figure out which languages this applies to; I am not seeing a specific pattern in which languages do and do not. I see two possibilities: run this task only for languages where the proposed change has the intended effect, or scrap this BRFA and do these changes manually. Primefac (talk) 12:52, 4 July 2023 (UTC)Reply[reply]
@Primefac I think we can run it for some languages that we can determine now; it shouldn't be that hard as long as it is consistent for each language (e.g. German never uses it, Polish always uses it, etc.). We could create a list for all languages that Wikisource exists on, or just run it for now for some of the biggest editions (e.g. the ones with interwikis here). I did some checks and it seems it's pretty consistent, just a Wikisource naming convention. Note that depending on the language, the "author" prefix is different: Polish is "autor", Swedish is "Författare", etc. In the end, what we need to fix is not the outgoing links but the text on our side. Consider this case, similar to the German one you quote, where we improved the language of our template but messed up the link: before, diff, after. Since the links work, can we just figure out a way to change the wording in the template but retain the same link as before? The older template was able to do it somehow; it seems we are introducing a new error? Piotr Konieczny aka Prokonsul Piotrus| reply here 04:48, 5 July 2023 (UTC)Reply[reply]
If you wouldn't mind making a list of which languages use the Author (in whatever language) prefix, I can hard-code their use into the template so that there isn't any issue.
This wasn't a problem before because {{wikisource author}} only linked to the English version, so no translation or odd coding was necessary. As mentioned in the original discussion, {{wikisource lang}} just links to a general search (which does sometimes turn up the author page directly) and thus does not require the "Author:" prefix. Primefac (talk) 08:06, 5 July 2023 (UTC)Reply[reply]
  On hold. Just for now, while we deal with actual template issues. Primefac (talk) 08:31, 5 July 2023 (UTC)Reply[reply]
@Primefac See talk, is this helpful? Those are most larger Wiki source projects, should be enough to get most of our stuff sorted out. We can take a look at what, if anything, is left after dealing with those languages? Piotr Konieczny aka Prokonsul Piotrus| reply here 07:06, 7 July 2023 (UTC)Reply[reply]
Should do, thanks for that. Going to keep this on hold for a bit longer, there's a TFD for merging all of these together and I might be able to enact these proposed changes during the merge process. Primefac (talk) 08:13, 7 July 2023 (UTC)Reply[reply]
@Primefac Just checking the status of this? Piotr Konieczny aka Prokonsul Piotrus| reply here 09:54, 22 September 2023 (UTC)Reply[reply]
Somewhat stalled, been rather busy myself and it doesn't look like anyone has started work on the template merger. I think I might have cleared my on-wiki plate somewhat (touch wood) so I'll see about prioritising the merger. Primefac (talk) 10:51, 22 September 2023 (UTC)Reply[reply]

Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.

Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.