Wikipedia:Bots/Requests for approval/DASHBot 4
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Tim1357
Automatic or Manually assisted: Automatic
Programming language(s): Python, Automator, and AppleScript (I'm pretty lazy :) )
Source code available: Nope.
Function overview: Mark pages that are tagged for deletion as patrolled.
Links to relevant discussions (where appropriate): Discussion at New Pages Patrol
Edit period(s): Continuous
Estimated number of pages affected: 1 in 25 new pages.
Exclusion compliant (Y/N): N
Already has a bot flag (Y/N): N (but it needs one to function better)
Function details:
- Get a large batch of new pages (using query).
- Get the list of articles that are marked with an AfD, CSD, or PROD tag (using CatScan, because I was getting mad at the API).
- Get the RCID for the pages in the intersection of the new pages and the tagged articles, using query.
- Patrol those pages (using index.php instead of api.php).
- Wait 5 minutes.
- Go again.
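The loop above could be sketched in plain Python roughly as follows. This is an illustrative sketch only, not the bot's actual code: the function names and the example data are made up, and the network fetches (which the real bot does via query, CatScan, and index.php) are passed in as callables so the core logic is visible.

```python
import time


def pages_to_patrol(new_pages, tagged_pages):
    """Return the new pages that also carry a deletion tag.

    new_pages maps a page title to its recent-changes ID (rcid);
    tagged_pages is a set of titles currently tagged AfD/CSD/PROD.
    """
    return {title: rcid for title, rcid in new_pages.items()
            if title in tagged_pages}


def run_once(fetch_new, fetch_tagged, patrol):
    """One pass of the bot: fetch both lists, intersect, patrol."""
    targets = pages_to_patrol(fetch_new(), fetch_tagged())
    for title, rcid in targets.items():
        patrol(title, rcid)
    return targets


# The bot itself would wrap run_once in an infinite loop:
#   while True:
#       run_once(...)
#       time.sleep(5 * 60)   # wait 5 minutes, then go again
```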
Note: I was going to do this all in Python, but I could not figure out how to configure cookies, so I interface the Python script with the API through Automator (it's messy, but it gets it done). Then, to loop it, I have to use AppleScript (Automator does not have an infinite-loop option).
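For what it's worth, cookies can be handled entirely within Python's standard library, which would remove the need for the Automator/AppleScript glue. A minimal sketch in modern Python 3 (the 2009-era Python 2 equivalents, cookielib and urllib2, work the same way; the example URL in the comment is illustrative):

```python
import http.cookiejar
import urllib.request

# A CookieJar attached to an opener stores and resends cookies
# automatically, the way a browser session would.
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(jar))

# Any request made through this opener keeps its session cookies, e.g.:
#   opener.open("https://en.wikipedia.org/w/api.php?action=login&...")
# Subsequent opener.open() calls send those cookies back, so the
# logged-in session persists across requests.
```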
Discussion
I need the ability to patrol pages and a bot flag. (This is because the API currently limits me to the 500 most recent pages, and I want to do more.) Tim1357 (talk) 17:29, 22 December 2009 (UTC)[reply]
- Is there a reason you are using index.php instead of api.php? I am not highly competent in that area, but I was under the impression the API was preferred. MBisanz talk 06:36, 27 December 2009 (UTC)[reply]
- Yes. The simple reason is that index.php requires only a title and an rcid (with cookies automatically taken care of), while api.php requires a title, an rcid, and a token. Plus, there is no prebuilt function in any of the Python packages (pywikipedia or wikitools) to do this. If a lot of pages were going to be affected by this, I would build my own API function; however, because of the limited number, I figured it would be OK to use index.php. Tim1357 (talk) 06:55, 27 December 2009 (UTC)[reply]
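For reference, the api.php route amounts to one extra step: fetch a patrol token, then POST it along with the rcid. The helper below only builds the request parameters; it is a hypothetical sketch (the token value is a placeholder, and the exact way to fetch a patrol token varied across API versions), not code from the bot:

```python
def build_patrol_request(rcid, token):
    """Build the POST parameters for an api.php patrol action.

    Per the MediaWiki API, action=patrol takes the rcid of the
    recent-changes entry plus a patrol token.
    """
    return {
        "action": "patrol",
        "rcid": str(rcid),
        "token": token,
        "format": "json",
    }
```

The token is the step that index.php hides; once a wrapper like this exists, the api.php route needs no more inputs than the index.php one.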
- I've done a few so far. Stop me any time. Otherwise I will stop in 3 more days. Tim1357 (talk) 06:10, 29 December 2009 (UTC)[reply]
Approved. MBisanz talk 04:36, 1 January 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.