Template talk:Infobox UK university rankings

Switch statements

Currently, this template is available for use: a user puts it on a page, looks up the school's ranking in one or more sources, and lists it. The problem is that the information likely changes every year, so an untold number of school pages each need updating to reflect changes in the reporting newspapers. Would it not be better to incorporate a switch into this template, so that school rankings can be updated automatically? {{PAGENAME}} initially seems like a great choice for a switch statement. For instance (by the way, if you view the following in edit mode it looks a lot neater)...

| <tr><th>''[[The Sunday Times]]'' <ref name="The Sunday Times University Guide 2008 ">{{cite web|url=http://extras.timesonline.co.uk/stug/universityguide.php |title=The Sunday Times University Guide (2010) |author=The Sunday Times |year=2010 |accessdate=2010-01-29}}</ref></th> <td>{{#switch: {{PAGENAME}} |University of Oxford=1 |University of Cambridge=2 |Imperial College London=3 |University College London=4 |University of St Andrews=5 |University of Warwick=6 |Durham University=7 |University of York=8 |London School of Economics and Political Science=9 |University of Bristol=10 |University of Bath=11 |University of Southampton=12 |King's College London=13 |University of Nottingham=14 |Loughborough University=15 |University of Edinburgh=15 |University of Exeter=17 |University of Sheffield=18 |Lancaster University=19 |University of Leicester=20 |University of Birmingham=20 |University of Glasgow=20 |University of Sussex=20 |University of Leeds=24 |Newcastle University=24 |University of Manchester=26 |University of Aberdeen=27 |University of East=28 |University of Liverpool=29 |Queen Mary University of London=30 |Cardiff University=31 |University of Stirling=32 |University of Surrey=33 |University of Dundee=34 |Royal Holloway University of London=35 |University of Reading=36 |Aston University Birmingham=37 |School of Oriental and African Studies=38 |University of Strathclyde=39 |Keele University=40 |University of Kent=41 |Queen's University Belfast=42 |Aberystwyth University=43 |Goldsmiths University of London=44 |Heriot-Watt University=45 |University of Essex=46 |University of Hull=47 |University of Buckingham=48 |Brunel University=49 |Oxford Brookes University=50 |City University=51 |Swansea University=52 |University of Bradford=52 |The Robert Gordon University=54 |University of Ulster=55 |Harper Adams University College=56 |University of Brighton=57 |Nottingham Trent University=57 |De Montfort University=57 |University of Portsmouth=60 |University of Plymouth=61 
|Bangor University=62 |Glasgow Caledonian University=63 |Edinburgh Napier University=64 |Sheffield Hallam University=64 |University of the Arts London=66 |University of the West of England=67 |Northumbria University=68 |University of Central Lancashire=68 |Bournemouth University=68 |University of Hertfordshire=71 |Kingston University=72 |University of Salford=73 |University of Lincoln=74 |University of Huddersfield=75 |University of Teesside=76 |Bath Spa University=76 |Manchester Metropolitan University=78 |Coventry University=79 |Queen Margaret University Edinburgh=80 |University of Abertay Dundee=81 |Birmingham City University=82 |University of Chichester=83 |University of Chester=83 |University College Falmouth=83 |University of Wales Institute Cardiff=83 |University of Sunderland=87 |University of Gloucestershire=88 |Liverpool John Moores University=89 |University of Westminster=89 |Canterbury Christ Church University=91 |University of Greenwich=91 |Edge Hill University=93 |York St John University=93 |Staffordshire University=95 |University of Winchester=96 |Leeds Metropolitan University=97 |University of Wales Lampeter=98 |University of Glamorgan=99 |University of Northampton=100 |Roehampton University=101 |University of Bedfordshire=102 |St Mary's University College Twickenham=103 |University of Cumbria=104 |Newman University College Birmingham=105 |Thames Valley University=106 |University of East London=107 |Trinity University College=108 |University of Wales Newport=109 |University for the Creative Arts=110 |Anglia Ruskin University=111 |University of Derby=112 |Glyndwr University=113 |Leeds Trinity=114 |London South Bank University=115 |University of Worcester=116 |Middlesex University=117 |University College Plymouth St Mark & St John=118 |Buckinghamshire New University=119 |University of Bolton=120 |University College Birmingham=121 |Southampton Solent University=122 |UHI Millennium Institute=N/A |Birkbeck College London=N/A |The Open University=N/A 
|University Campus Suffolk=N/A |University of the West of Scotland=N/A |N/A}}</td></tr>

Anyone have any questions/concerns? Banaticus (talk) 02:28, 30 January 2010 (UTC)
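Distilled, the pattern proposed above is a {{#switch:}} keyed on {{PAGENAME}}, with a trailing unnamed value acting as the default for any page not in the list (the final |N/A above). A minimal sketch:

```wikitext
<!-- Minimal sketch of the proposed pattern: the page's own title selects its rank;
     the trailing unnamed value is the default for any page not listed. -->
{{#switch: {{PAGENAME}}
 | University of Oxford    = 1
 | University of Cambridge = 2
 | N/A
}}
```

One caveat with this approach is that the list must still be edited by hand each year; it simply centralises that edit in the template rather than on each article.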


Make year values editable

While updating Bath Spa University's article to include the 2015 Complete University Guide ranking, I noticed that this infobox doesn't allow users to edit the year values that accompany each ranking. I'd like to propose making the year value editable, and maybe even allowing the last three years' entries for each awarding body. For example:

| Complete1     = 75
| Complete1Year = 2013
| Complete2     = 79
| Complete2Year = 2014
| Complete3     = 69
| Complete3Year = 2015

David Bailey (talk) 13:04, 12 May 2014 (UTC)
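On the template side, the proposal could be implemented by rendering each rank/year pair conditionally (a sketch only; the row markup is illustrative and the parameter names follow the example above):

```wikitext
<!-- Sketch: emit a Complete University Guide row only when a rank is supplied;
     the year is shown in parentheses when given. -->
{{#if: {{{Complete1|}}}
 | <tr><th>Complete University Guide{{#if: {{{Complete1Year|}}} | &nbsp;({{{Complete1Year}}}) }}</th><td>{{{Complete1}}}</td></tr>
}}
```

Because {{{Complete1|}}} defaults to an empty string, articles that never set the parameter would produce no row at all.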

Expansion of inclusion

I suggest encompassing two international rankings: the Best Global Universities by US News & World Report, which previously collaborated with QS, and the ARWU Alternative Ranking. Since only overall rankings with multiple factors should be included in the template, I'm not sure the THE Reputation Rankings are worth a place here. Biomedicinal (contact)

I also agree with Biomedicinal about the Best Global Universities by US News & World Report, which I think is one of the most widely utilized rankings in the US.
I would actually propose the inclusion of the CWTS Leiden Ranking rather than the US News & World Report, as it is the fourth global metric used by universityrankings.ch. The website has handily included UK universities' annual performance in all four of the rankings since they began, which makes it easy for users to see the performance of universities over the years. The Leiden rankings are also regularly included in the press releases of UK universities.[1] EmyRussell (talk) 16:41, 6 December 2015 (UTC)
I'm slightly supportive of including the Leiden rankings; they do seem to be quoted in the media from time to time. We should avoid comparing to what universities quote, as they will be biased anyway. Like Biomedicinal, I'm for removing the THE reputation rank; I don't know what it adds and had not seen it before it was added to this template. Aloneinthewild (talk) 20:23, 6 December 2015 (UTC)
I am for removing the THE reputation ranking as well. I believe that the CWTS Leiden ranking would be a suitable replacement for it, but I am happy to consider others if a strong case is argued for an alternative. Yes, I agree that is an important consideration. I mainly support Leiden because it appears on universityrankings.ch. EmyRussell (talk) 23:54, 6 December 2015 (UTC)
But is the CWTS Leiden Ranking too science-centered, as it has no non-science indicator? ARWU is also a bit biased towards science, but it does take care of humanities-based institutions. Biomedicinal (contact)
The Leiden Rankings measure most fields, including the social sciences and humanities; please take a look at the website and click the 'Fields' option. At the bottom, social sciences and humanities is one of the available options in the dropdown menu.[2] I would actually say that ARWU is more biased towards the STEM fields than Leiden is. EmyRussell (talk) 17:30, 7 December 2015 (UTC)
I also agree with Biomedicinal and others about removing the THE reputation rank. In terms of the Leiden rankings, I understand that they are very newsworthy, but of questionable utility and accuracy. For example, among top schools there is a substantial difference between Cambridge (ranked 18th globally) and Zhejiang University (ranked 4th globally); no student in the UK or US would choose Zhejiang University over Cambridge. I also agree with Biomedicinal about the US News and World Report "Best Global Universities" ranking, which is highly utilized by students globally and may be more practical than the Leiden rankings. It is also the primary ranking for US schools. There is alternatively a US News and World Report ranking for global universities in Europe, if that is preferred.
I agree with Biomedicinal that the US News and World Report seems a useful indicator. Mikecurry1 (talk) 09:14, 2 November 2017 (UTC)


Paywall

The Times cannot be viewed without a subscription. Should this parameter have a paywall tag? Leschnei (talk) 00:35, 7 August 2016 (UTC)

Inclusion of CWTS Leiden ranking

User Crpjwovdelete (talk · contribs) has been removing the CWTS Leiden Ranking from university infoboxes, effectively nullifying the earlier consensus here on inclusion. I have asked him to discuss this here rather than continuing to remove rankings and revert their re-insertion. I would also like to invite the participants in the previous discussion here: biomedicinal (talk · contribs), Aloneinthewild (talk · contribs) and EmyRussell (talk · contribs) (it appears there was also at least one anonymous contributor).

The charges made (in edit summaries) against the CWTS Leiden rankings are:

  1. "CWTS/Leiden is not as reputable or even comparable as the Top 3 ranking institutions: ARWU, QS and THE."
  2. (Following my noting that this was contrary to the consensus on this talk page) "Previous discussion to not include all the rankings, but main ones. Discussion from UK UniversityRankingsTemplate that Leiden's a biased indicator"

In response to these, I would say that the decision to include a ranking in the template is effectively a decision to include it on pages that use the template. Furthermore, I do not see the earlier discussion as indicating the CWTS Leiden ranking to be biased – the question of a pro-science bias was raised, but it was shown that the ranking covers both the sciences and humanities, and probably has less of a bias than ARWU (I would agree with that assessment). In any case, all rankings are, by their nature, biased in one way or another; if we excluded rankings because they have biases, we would include none. What is important is whether the ranking is well established internationally – and it seems to me that there is plenty of evidence that CWTS Leiden is. It has received significant coverage in the international press (e.g. [1][2][3][4]) and is frequently referenced by universities (e.g. [5][6][7]). It seems to me that the CWTS Leiden ranking is well established and should continue to be included both in the template and as a filled-in parameter on pages where the template is used, but I would be interested to know what others think. Robminchin (talk) 02:13, 22 May 2017 (UTC)

I noticed all of Crpjwovdelete (talk · contribs)'s edits as well and was confused by the comment left after each edit: "Previous discussion to not include all the rankings, but main ones. Discussion from UK University Rankings Template that CWTS is a biased indicator" - the previous consensus we held was rather the opposite: we should include the Leiden ranking. In fact, I am slightly irritated that they removed the ranking from all these pages without prior consultation here, and I would like them to revert all the edits they made; alternatively, I will do so.
I follow your line of thought, Robminchin (talk · contribs), and I do not believe their argument holds much merit. EmyRussell (talk) 02:51, 22 May 2017 (UTC)
I've reverted the rest of the edits I could see by Crpjwovdelete; it seems like a single-purpose account. I'm not really sure about the Leiden ranking – I guess we did have consensus to include it, but I don't know enough to judge it as a ranking metric. (I'll come back and comment here when it's not so late.) Aloneinthewild (talk) 22:52, 22 May 2017 (UTC)

I would also oppose the CWTS ranking. Read the UCL talk page; it is considered very biased. For example, there are six University of California campuses in the top 20. While UCLA and Berkeley are good schools, I do not think UC Santa Barbara should be ranked above Oxford. There is a bias towards larger universities, which the UCs are; UC Santa Barbara is not necessarily better at research than Oxford. Similarly, Caltech, a small research powerhouse, is ranked 175th because of its small size.

Criticisms of Leiden not yet discussed

  1. Bias towards large institutions
  2. Usefulness of the ranking (it is not in alignment with any of the other major rankings – e.g. five of its top universities are very large institutions in China, Seoul, Brazil and Canada that are not ranked nearly as highly elsewhere, and few students should base school decisions on this metric)
  3. Alignment with the other major indicators (very far off – for example, the University of Sao Paulo, with over 88,000 students, is ranked 8th in the Leiden ranking and between 100 and 250 in QS, THE and ARWU)

I think if we were going to include rankings there would be better indicators than Leiden to use, such as the Reuters World's Innovation Rankings (which would add a new dimension to the rankings), the Ranking of Rankings (a composite of all five major rankings), or possibly the US News and World Report. Leiden is interesting but, as the UCL talk page shows, has considerable flaws. Also, the other major ranking infoboxes – US, Europe, Asia – do not use Leiden. — Preceding unsigned comment added by 2605:E000:6003:5900:DDFF:A270:DEF:B4C4 (talk) 08:23, 8 June 2017 (UTC)

The Leiden PP(top 10%), which is the one generally used, ranks institutions by the proportion of their publications that are in the top 10% by citations. By looking at the proportion, it corrects for the size bias towards larger institutions. Caltech, for example, is ranked 7th by Leiden and Sao Paulo 765th. (The documentation for the infobox should probably be updated to clarify that this is the ranking used; it currently makes no mention of the Leiden ranking.) I would also note that ARWU makes no correction for size and so contains an inherent bias. As I have said before, if we were going to exclude rankings based on their biases, ARWU would be the first to go. The U.S. News & World Report global ranking (not currently in the list here) also has a built-in bias towards larger institutions, with 40% of the ranking based on raw metrics uncorrected for size.
I was unable to find a discussion of the Leiden ranking on the UCL talk page or the archives - please provide a link.
However, the bottom line is that whether we consider a ranking to be biased, or find that it does not reach the same conclusions as other rankings, is not what this decision should be based on. What matters is that it receives significant coverage showing that it is considered a ranking worth reporting on.
To comment briefly on the other rankings you mention:
The Reuters World Innovation Ranking is very specific - more akin to a subject ranking than a general ranking. Furthermore, its somewhat bizarre treatment of the University of London as a single institution makes it unsuitable for use in the UK, as this means it omits major universities such as UCL, KCL, LSE, etc.
The US News global ranking is fairly new and has not, as far as I have seen, received much coverage outside of the US. If it does (or if it has and I have missed it), then it should be included.
The infobox doesn't generally include composite rankings (e.g. the THE table of tables for UK national rankings); I was unable to find anything about the "Ranking of Rankings", which isn't mentioned in the College and university rankings article.
Robminchin (talk) 00:38, 10 June 2017 (UTC)
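Since the distinction between P and PP(top 10%) keeps coming up: the PP(top 10%) figure is simply the share of an institution's papers that fall in the top 10% by citations, which makes it independent of institution size. A sketch with made-up numbers (purely illustrative, not real Leiden data):

```python
# Hypothetical publication counts (illustrative only, not real Leiden data).
pubs = {
    "Large University": {"total": 20000, "top10": 1600},  # 8% in top decile
    "Small Institute":  {"total": 800,   "top10": 160},   # 20% in top decile
}

def pp_top10(counts):
    """PP(top 10%): proportion of papers in the top 10% most cited."""
    return counts["top10"] / counts["total"]

# Ranking by proportion rather than raw count is independent of size,
# so the small institute comes out ahead despite publishing far less.
ranked = sorted(pubs, key=lambda name: pp_top10(pubs[name]), reverse=True)
print(ranked)  # ['Small Institute', 'Large University']
```

A raw-count ranking (the 'P' indicator) would invert this order, which is the effect being discussed above for Caltech and Sao Paulo.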
On a side note, I do 100% agree with you about the ARWU.
Oh, to access PP(top 10%) one would need to click through several configuration settings. Why PP(top 10%) over P(top 10%) or P? While PP(top 10%) corrects some of the Leiden ranking's bias towards large universities, it is not the only measure that should be used in determining a world ranking. Some people use the Leiden rankings without the PP(top 10%) setting to indicate a university's Leiden ranking. Without PP(top 10%), Caltech is not ranked 7th overall (only on that PP(top 10%) measure); Caltech is ranked 172nd overall, well below many weaker research universities – for example, just below the University of Georgia–Athens, which no one has heard of. I just think that while some people want the Leiden rankings, others think it is quite biased. It does not need to be included on every university page.
Another option, Uni Ranks: https://uniranks.com/ranking

— Preceding unsigned comment added by 2605:E000:6003:5900:ACDF:7F38:B0B2:2859 (talk) 14:25, 15 June 2017 (UTC)

I think the scores included in the rank box for UK universities are all for PP(top 10%), which is generally what universities refer to, for the obvious reason that it measures quality rather than quantity (as I said above, this should be made explicit). In a ranking purely by quantity, I'm not surprised Caltech comes below UGA (which I have heard of, btw) – that's still a valid ranking, just not a particularly useful one if you want to measure quality.
Nobody is saying the Leiden ranking is the only one that should be used, just that it should be one of the ones listed. That's why we list many different rankings and let the reader decide which they want to use. The problems with the UniRanks ranking are that (a) it isn't actually a ranking but a mix of the other rankings, so why not just show those? and (b) the Reuters innovation rankings are meaningless in the UK context due to treating the University of London as a single institution, making the UniRanks ranking (which includes the Reuters ranking) similarly problematic: it can't give a valid rank for UCL, KCL, LSE, etc. Robminchin (talk) 17:00, 15 June 2017 (UTC)
I came here because of this edit by Robminchin at University College London: https://en.wikipedia.org/w/index.php?title=University_College_London&type=revision&diff=792837033&oldid=792821325. I see no consensus here for the use of a form of the CWTS Ranking which is not the one presented most prominently on the official website of CWTS Leiden: [8]. To not use the ranking producer's default ranking is both original research and misleading. Readers will expect the ranking presented in the ranking table to be the "official" ranking. If readers have an issue with the quality of the CWTS Ranking, they should find credible sources for their critique and then add content here: https://en.wikipedia.org/wiki/CWTS_Leiden_Ranking. If the ranking is so defective, it should be excluded from this template (I am neutral on that, although I tend to think that just ARWU, THE and QS is preferable, to avoid bloat), but the approach suggested by Robminchin is the worst of all worlds. 88.98.200.46 (talk) 22:04, 29 July 2017 (UTC)
Worth noting also that Robminchin's preferred PP(top 10%) measurement places Rockefeller University ahead of MIT, Harvard and Stanford, puts Rice University ahead of Yale and Columbia, and the University of Exeter ahead of Johns Hopkins. In short, leaving aside the above, it is of highly questionable quality as a ranking. 88.98.200.46 (talk) 22:22, 29 July 2017 (UTC)
Thanks for coming here to discuss this. As it stands, following your re-revert, UCL is currently out of line with other universities using this infobox – it would have been better to discuss first and, if consensus were for a change, to change all institutions rather than just UCL. This would also be in keeping with WP:CYCLE, which advises against re-reverting a contested edit prior to discussion.
That the consensus is for PP(top 10%) can be seen by looking at the other UK universities using this infobox. You should also note that the PP(top 10%) numbers on the UCL page (prior to your edit) were placed by Y2jO3Nx, not by me. As far as discussion on this page goes, you will see that the consensus has been to follow the global rankings considered significant by universityrankings.ch (from the Swiss State Secretariat for Education, Research and Innovation) rather than for Wikipedians here to make their own determination of which rankings are significant. See, for example, EmyRussell's reply in the discussion below, where I suggested the addition of the U.S. News and World Report rankings. That site uses the Leiden PP(top 10%) ranking, as can be seen here.
As to the quality of rankings, as stated elsewhere on this page, that is not for us to determine. However, you should note that the 'P' ranking, which you have inserted in the UCL article, is simply a ranking by number of papers produced. This says nothing about quality, only quantity – thus the University of Sao Paulo ranks 7th and Caltech 172nd. The PP(top 10%) is a ranking by the proportion of published papers that are in the top 10% of cited papers in their field, and is thus a measure of publication quality (as measured by citations) that is independent of the size of the institution. This can allow small, specialist research institutions – such as Rockefeller University, the London School of Hygiene and Tropical Medicine and the Weizmann Institute of Science – to move ahead of larger (and possibly more famous) institutions.
I am a bit concerned by your assertion in your edit summary that my reversion to the PP(top 10%) was 'add[ing] negative content'. I don't see how using the same measure in the infobox as other UK universities can possibly be seen as negative. You should remember that the UCL article on Wikipedia does not exist to promote UCL but to give information from a neutral point of view. That the 'P' ranking gives a higher position to UCL (which would presumably be seen as 'positive') is not a valid argument for its use.
Robminchin (talk) 15:39, 30 July 2017 (UTC)
As there does not seem to be any support for moving to the 'P' ranking instead of the 'PP (top 10%)' ranking, I am restoring the PP (top 10%) numbers to the UCL page and will update the documentation here as suggested below. Robminchin (talk) 16:12, 6 August 2017 (UTC)
Adding my opinion, I also do not believe the Leiden ranking should be used. Mikecurry1 (talk) 09:12, 2 November 2017 (UTC)

The CWTS Leiden Ranking should be displayed in the infobox. People have conflicting views about the existing ranking tables; this does not warrant their omission from what is an encyclopaedic article. I have seen various criticisms of The Guardian rankings, for instance, which place a disproportionate focus on student experience. The CWTS Leiden Ranking is a notable and well-respected ranking, updated consistently each year. The infobox as currently used is not balanced, particularly when people are looking for national rankings to compare UK universities: it provides a very limited selection of national rankings, all of which use similar data sets such as the NSS. The University of Oxford and University of Cambridge no longer engage with the NSS, and yet their scores are not penalised by these ranking tables. CWTS, QS, ARWU and the like do not use the NSS, and alongside UK rankings they offer a more balanced view of a university's prestige. Dr.AndrewBamford (talk) 20:21, 23 June 2022 (UTC)

Proposed update to documentation

As there has been some confusion (see discussion above) over the Leiden rankings, and currently nothing is said about them in the documentation (or about national rankings taken from global rankings), I propose the following update to the table:

ARWU_W: The Academic Ranking of World Universities (ranking among all constituents).
QS_W: The QS World University Rankings (ranking among all constituents).
THE_W: The Times Higher Education World University Rankings (ranking among all constituents).
LEIDEN_W: The CWTS Leiden Ranking (ranking among all constituents, using the PP(top 10%) indicator).
ARWU_N, QS_N, THE_N, LEIDEN_N: As above, but ranked only among constituents in the United Kingdom.
Complete: The Complete University Guide, an annual ranking of UK universities produced by Mayfield University Consultants.
The_Guardian: The Guardian University Guide, an annual ranking of UK universities produced by The Guardian.
Times/Sunday_Times: The Times and Sunday Times University League Tables, an annual ranking of UK universities jointly produced by The Times and The Sunday Times.
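For illustration, an article using this template might then set these parameters as follows (the rank values here are hypothetical placeholders, not sourced figures):

```wikitext
{{Infobox UK university rankings
| ARWU_W   = 25  <!-- global rank -->
| ARWU_N   = 4   <!-- UK subset of the same global table -->
| QS_W     = 30
| THE_W    = 28
| LEIDEN_W = 40  <!-- PP(top 10%) indicator -->
| Complete = 12
| The_Guardian = 15
| Times/Sunday_Times = 14
}}
```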

Comment: The PP(top 10%) Leiden ranking is the one used by universityrankings.ch and in the university press releases linked to in the earlier discussion. It is also what has historically been used in this parameter, having been the default ranking on the page until last year. Of the three possible Leiden rankings, it appears to be the most notable. Robminchin (talk) 18:30, 18 June 2017 (UTC)

Include U.S. News and World Report Best Global University Ranking? Other rankings?

@EmyRussell, Aloneinthewild, and Biomedicinal: I see that including the USN&WR ranking was discussed a couple of years ago and brought up again recently, but not really discussed. Looking around, it seems that the U.S. News & World Report Best Global University Ranking is now more established than I had realised as an important international ranking that we should consider including (even though I dislike size-dependent rankings in principle). It is being used by a number of British universities, e.g. Liverpool[9], Southampton[10] and LSHTM[11], and by departments at Leeds[12] and UCL[13].

Are there other international rankings that we should also consider (e.g. Round University Ranking or URAP)?

However, I think that if we do add more global rankings then, in order to avoid overloading the infobox, we would have to remove the national sub-rankings from the global rankings (i.e. ARWU_N, QS_N, THE_N, LEIDEN_N). Robminchin (talk) 19:07, 18 June 2017 (UTC)

Hi @Robminchin, in my opinion it should not be up to any Wikipedians to decide which world rankings are included in the infobox. My support for the four world rankings currently featured (ARWU, QS, THE and Leiden) is due to their being selected by the State Secretariat for Education, Research and Innovation, a Swiss government agency which monitors how Swiss universities perform in world rankings. See: http://www.universityrankings.ch/en/.
Of course, there are arguments about the relevance of a Swiss government agency deciding which world rankings should feature in a British university infobox, but I believe that the merit of a non-biased governmental agency selecting what they deem to be the most important world rankings trumps this argument. Universities will inherently report on whichever rankings they perform well in, so the links you provided should be taken with a pinch of salt. I believe that if a user wishes to include further rankings, they are free to do so in the 'Reputation and rankings' section of the article, but I am less keen for all world rankings to feature in the university infobox.
I am also not sure about removing national sub-rankings in favour of adding more world rankings as the university info boxes of universities from other regions contain this information.
This is just my opinion though, and I am open to hearing the view of others. EmyRussell (talk) 21:34, 18 June 2017 (UTC)
Hi @Robminchin:, I think that is a great idea and a solid point to implement. The US News & World Report is a very influential ranking, and it is already included in the global rankings infobox used for United States universities. I also do not see the point of having national rankings in the world university rankings column, which seems cumbersome; the national rankings are right below.
I agree with RobMinchin's proposal on the inclusion of this ranking and the removal of national university rankings from the world university column. I think this is quite a popular and influential indicator now. Mikecurry1 (talk) 09:11, 2 November 2017 (UTC)
Agreed. Please see "Grouping into national and global rankings" for the latest update. (talk) 07:37, 10 April 2018 (UTC)

Proposal to include Teaching Excellence Framework in infobox

@Robminchin, Aloneinthewild, and Biomedicinal:

What are your thoughts on including the 2017 TEF results in the infobox of UK universities? As the 2014 REF results were used for the research component of all three domestic league tables, I did not see the necessity of including government-backed assessments of British universities in the infobox. However, the TEF results will not be used as a component of domestic league table positions, so I would like to propose including them in the infobox.

I'm aware that there are controversies surrounding the results, and even the methodology, but I believe that the results will be of significant interest to Wikipedia readers.

See: http://www.hefce.ac.uk/tefoutcomes/ for the list of higher education institutions which were assessed. EmyRussell (talk) 22:38, 6 September 2017 (UTC)

I would support this. There are certainly controversies surrounding the results and methodology, but the same could be said for any of the tables included here, and the controversies can be seen clearly by anyone looking at the TEF article on Wikipedia. It's certainly significant, and so should probably be included. Presumably it would just say Gold, Silver, Bronze or Provisional, rather than giving a ranking (as various publications have done with the results). Should it also state the year of the TEF award, as it is (IIRC) valid for three years (e.g. Gold (2017), or if awarded next year Gold (2018))? If the TEF should die, we can always revisit this. Robminchin (talk) 03:58, 9 September 2017 (UTC)
@Robminchin: That's great to hear. I was hoping that others would comment on this proposition with their thoughts, but I guess it is just the two of us. Yes, I believe that it should just indicate the rating: Gold, Silver, Bronze or Provisional. I'm not sure whether the year should be included as it's not an annual exercise, but I also see the importance of including the year the exercise was conducted - what are your thoughts? EmyRussell (talk) 16:45, 19 September 2017 (UTC)
I agree with Biomedicinal, AloneintheWild, Robminchin, and EmyRussell; this would seem useful.
I agree this is a useful indicator. Mikecurry1 (talk) 09:11, 2 November 2017 (UTC)

Grouping into national and global rankings

@EmyRussell, Aloneinthewild, and Biomedicinal:

An anonymous editor has changed the template to put the national sub-rankings of global rankings in with the national rankings. The previous layout, which made it clear that these were not national rankings but a national selection from a global ranking, was much clearer to me and I have restored this. Would other editors care to give their views on this? (I've pinged some frequent contributors to this page) Robminchin (talk) 21:43, 3 April 2018 (UTC)

@Robminchin: I am against the edits made by the IP editor and believe that the template should be reverted to the previous layout [14]. Changes of this scale should have been discussed on the talk page beforehand. On a minor note, the lack of capitalisation (as per Wikipedia policy) for 'ranking' and 'assessment', and the re-ordering, irk me. Nevertheless, if this version had to be kept, national sub-rankings of global rankings belong in the global rankings section, not the national rankings section. EmyRussell (talk) 00:14, 4 April 2018 (UTC)
@EmyRussell and Robminchin: I happen to support the changes made, for the reasons below. Firstly, grouping (by national / international) is an established decision on Wikipedia already in use by nearly every other country's university ranking template - including the world and admin-locked US university rankings templates. Secondly, the UK university ranking template's past design [15] only uses simple text next to the publication year to show whether a specific ranking compared a school to other UK schools only, or at the international level (a critical consideration). Personal preferences notwithstanding, this past design fails to take advantage of the graphical separation that Wikipedia has built in for more intuitive reading. Furthermore, with the latest change by Robminchin, a number of national rankings are currently incorrectly included in the "Global rankings" section. As per admin talk for the locked US template, the "Global rankings" section is meant specifically to group rankings a school has received when compared to other institutions at the global scale; if a ranking compares schools only within the nation, it belongs in the "National rankings" section. Using the University of Cambridge as an example, ARWU has ranked it 3rd globally and 1st within the UK. In this instance, ARWU's global (3rd) ranking for Cambridge would fall in the school's "Global rankings" category, while the national (1st) ranking would fall in the school's "National rankings" section. Robminchin's latest change is a departure from Wikipedia's existing admin-locked US template, and I'm not sure why that should be the case. Considering the merits of content standardization and style uniformity, both key elements of the Wikipedia style guide, I'm not sure why these changes should not be adopted. Having said that, I do agree the capitalization could be better handled. Derek328 (talk) 09:31, 4 April 2018 (UTC)
There is a significant difference between a national subset extracted from a global ranking and a national ranking. For the world university rankings, it is clear that the regional rankings must be subsets of the global rankings as the template is explicitly about world rankings. For the US the national subset is only given for ARWU, not for any of the other global rankings listed, which seems indefensible. (USNWR and THE both produce separate national and global rankings with different methodologies; their national rankings are not national subsets of global rankings.)
The positioning of the national subsets of the global rankings with the full global rankings makes it clear that these are not national rankings but national subsets of global rankings, while placing them with the national rankings makes it appear that they are methodologically independent rankings. More consistent with the US template would be to omit the national subsets of the global rankings altogether, as is done there for all but ARWU. Robminchin (talk) 15:58, 4 April 2018 (UTC)Reply
@Robminchin: Thanks for the quick response. I believe you make a good point, particularly regarding how we may better align the UK university rankings template with that of nearly every other country at the moment (including the admin-locked US university rankings template): there is a difference between "a national subset extracted from a global ranking" and a "national" ranking. Using the University of Cambridge again as an example, its QS ranking in 2018 was 5th globally, but it was never officially given the ranking of 1st by QS as currently implied, even though it appears so when the full global ranking is custom-filtered by nation. After reviewing all 12 rankings currently listed in the UK university ranking template, and considering that the admin-locked US template does not treat custom-filtered results as an officially awarded ranking, it seems the fair way forward for the UK university ranking template would be to remove data derived from custom filters that limit or omit portions of a ranking system as it was originally published.
This is a great step to achieving content standardization and style uniformity, both key elements of the Wikipedia style guide. Thank you for your inputs, and I will make the changes necessary to reflect this updated consensus. Derek328 (talk) 07:37, 10 April 2018 (UTC)Reply
As I said before, the inclusion of a national sub-ranking for only the ARWU seems indefensible. Note that the US template being admin-locked is a reflection of past edit wars, not some kind of imprimatur. For consistency, we either need to include national sub-rankings for all of the global rankings or for none. For now I'll remove the ARWU national ranking, but I would welcome the input of other editors on this. Robminchin (talk) 03:10, 11 April 2018 (UTC)
Sure thing, thank you for your contribution. As for your point regarding the US's inclusion of a national ARWU sub-ranking, while I understand it is not meant to be some kind of imprimatur as you mentioned, I believe the ARWU ranking is included in the "national" section because schools apparently do receive an official, national ranking published directly from the ARWU - even though methodologies remain the same. The difference here, I believe, may come down to the fact that it is an officially awarded ranking, as opposed to one merely appearing so when a full global ranking is being custom-filtered by nation. Having said that, I'm happy with the new format for the UK template and agree this is already a major step forward in terms of compliance with the Wikipedia style guide. Thank you. Derek328 (talk) 04:30, 11 April 2018 (UTC)Reply
I also agree with Robminchin's idea here to remove the national rankings from the global rankings. Great edit, thanks! — Preceding unsigned comment added by Mikecurry1 (talkcontribs) 08:03, 11 April 2018 (UTC)Reply

TEF updated


Many universities now have updated TEF ratings, released yesterday: https://www.officeforstudents.org.uk/advice-and-guidance/teaching/tef-outcomes/

I updated Durham's as I'm an active editor there; but technically Durham's page is now wrong, because the TEF reference still points to an old version of the ratings. On the other hand, if I updated the reference, then every other university page would be wrong, as their ratings haven't been updated.

Is there a procedure by which these are meant to be updated all at once?

I do wonder whether one of the proposals made in the past might make this simpler - either that rankings be recorded here, with a switch statement, rather than on individual pages; or that there be a parameter for year. TSP (talk) 10:13, 7 June 2018 (UTC)Reply

I have now updated the reference, as the universities on my watchlist mostly seem to have had their ratings updated by individual editors; but there should probably be some way of checking them all to make sure none are out of date with the reference? TSP (talk) 14:40, 11 June 2018 (UTC)Reply
TSP, templates that generate named references to be used in both the template and the article have update problems! I don't know what the solution is. The niceness of not having duplicate references in the reference list is outweighed, in my opinion, by the problems these templates generate. (There are a lot of these templates which are regularly updated.) The current list of universities with problems is here. StarryGrandma (talk) 20:34, 22 June 2018 (UTC)
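To make the year-parameter idea above concrete: a centralised #switch of the kind sketched at the top of this page (for the Sunday Times table) could hold both the TEF rating and its award year, so a single edit to the template updates every university page at once. The parameter names and ratings below are hypothetical examples, not actual TEF data:

| <tr><th>[[Teaching Excellence Framework|TEF]]</th> <td>{{#switch: {{PAGENAME}}
 | Durham University = Gold (2017) <!-- example value only, not real data -->
 | University of Example = Silver (2018) <!-- example value only, not real data -->
 | #default = {{{tef|}}}
}}</td></tr>

Because the rating would sit next to the single named reference inside the template, the ratings and the source could never go out of sync the way they did here.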

Proposal to Include Reuters World or Europe's Most Innovative Universities Ranking


This has become a popular ranking with many UK universities on it. Reuters is a UK-based organization. As Reuters evaluates innovation rather than research, it provides a fresh perspective on university rankings. Proposal: include the Reuters ranking in the global ranking section. https://www.reuters.com/article/us-emea-reuters-ranking-innovative-unive/reuters-top-100-europes-most-innovative-universities-2018-idUSKBN1HW0B4 — Preceding unsigned comment added by Mikecurry1 (talkcontribs) 21:42, 24 September 2018 (UTC)

I don't think it is a particularly useful or well known ranking, I would like to see more evidence before it is included. Robminchin (talk) 03:13, 25 September 2018 (UTC)Reply
I am providing some initial evidence for you, as requested, in line with the evidence linked on this forum for having government agencies determine which rankings to use. This Reuters innovation ranking is used by the European Commission, the World Economic Forum, and the World Government Summit, and is also prominently used by the French government in its push for French innovation, as well as by the Hungarian government, the Singaporean government, and various US states (such as Texas, to determine innovation in its universities). [16][17][18][19][20]
It is also promoted by a variety of UK universities, as well as universities across the world, which publish its results, including Manchester, Southampton, Cardiff, Dundee and Sussex, and it is included on the same Swiss State Secretariat for Education, Research and Innovation website, swissinfo.ch, that was the primary justification for Leiden.[21][22][23][24][25]
It is also cited by QS, THE, and major publications and newspapers globally.[26][27][28][29]
Mikecurry1 (talk) 00:05, 26 September 2018 (UTC)Reply
Thanks. I've looked through the links, and they mostly seem to be people saying "our university [or universities] did well" – similar articles can be found for virtually any international ranking. (The website the list for this template has been taken from is universityrankings.ch, not swissinfo.ch, by the way.) There's nothing I can see saying that this is a major ranking or that governments are actually using it, rather than just advertising their universities' success in it (that a file with the rankings has been posted on a public forum on the European Commission's information site doesn't prove anything).
I also note that it doesn't have its own Wikipedia article, which would make it impossible to refer people to that article to find out more about the ranking. That's pretty much essential for its inclusion in the template. Robminchin (talk) 04:30, 26 September 2018 (UTC)Reply
The first links are government uses of this ranking, from .gov sources or highly influential government policy sources. Here is a link where it is used by the minister of economy of France to explain France's innovation. [30]
This ranking was previously on the wiki page for college and university rankings. To address your second point, I have now created a new wiki page, Reuters - The World's Most Innovative Universities. Mikecurry1 (talk) 20:26, 26 September 2018 (UTC)
I will be restoring the freshly created wiki page for the Reuters - World's Most Innovative Universities ranking; it was speedily deleted without time for commenting, so that may take a couple of days. :( Mikecurry1 (talk) 23:58, 26 September 2018 (UTC)
As I said before, I don't see any evidence in those links of it being defined as a major university ranking or used for anything other than promotional purposes. That the French government uses it in advertising is a long way from the French government using it to set university policy. If they were using it as a KPI for their universities over an extended period, which would be similar to universityrankings.ch, then that might be an argument for inclusion, but so far there is no evidence of anything like that. In fact, neither of the French government links you posted was even using the universities ranking. The first one was Reuters' World's Most Innovative Research Institutions, about non-university research institutions, and the second was Reuters' Top 100 Global Innovators, about commercial companies. There really doesn't seem to be a case here. Robminchin (talk) 01:15, 27 September 2018 (UTC)
I am also against the inclusion of this new ranking for the same reasons raised by Robminchin. EmyRussell (talk) 13:56, 27 September 2018 (UTC)Reply
From Science and technology in Hungary Wiki Intro, "The key actor of research and development in Hungary is the National Research, Development and Innovation Office (NRDI Office), which is a national strategic and funding agency for scientific research, development and innovation, the primary source of advice on RDI policy for the Hungarian government, and the primary RDI funding agency. Its role is to develop RDI policy and ensure that Hungary adequately invest in RDI by funding excellent research and supporting innovation to increase competitiveness and to prepare the RDI strategy of the Hungarian Government, to handle the National Research, Development and Innovation Fund, and represents the Hungarian Government and a Hungarian RDI community in international organizations."[1].(link in US News Section)
The new page created, Reuters - The World's Most Innovative Universities, was later restored/approved by the editor who deleted it. It was originally on the college and university rankings article. Mikecurry1 (talk) 04:27, 1 October 2018 (UTC)
The Hungarian article is yet another news item, linking through to a press release from KU Leuven. It does not say anything about the status of the ranking.
To be clear, your continued advocacy on this issue with a string of low-quality or completely irrelevant "sources" gives the appearance of agenda-pushing rather than being WP:HERE to build an encyclopedia. Robminchin (talk) 06:21, 1 October 2018 (UTC)

References

  1. ^ "The National Research, Development and Innovation Office". NRDI Office.

Inclusion of US News and World Report


Looking back over the discussions, I don't see any consensus to include this ranking, and most discussion is a long way from being current – so the fact that no change happened at the time implies a lack of consensus for change. I have commented out the US News ranking to give a chance for discussion to take place prior to implementation. I have also returned the template to alphabetical order, which is generally seen as the best way to avoid bias in ordering. Robminchin (talk) 03:26, 25 September 2018 (UTC)

To kick off debate: the global rankings included in this template (ARWU, CWTS Leiden, QS, THE) are based on the objective criterion of their being defined as the main international rankings by [universityranking.ch], a project of the Swiss Secretariat for Education, Research and Innovation (SERI) and swissuniversities, run by SERI. This is in keeping with the Wikipedia philosophy that (as far as is possible) editorial decisions are objective – we follow outside experts rather than making decisions ourselves. If we are going to change, it should be to similarly objective criteria, not based on the feelings of editors as to how important rankings are or how much they are trumpeted by university PR departments.
Just to add, I proposed adding the US News ranking a bit over a year ago (see #Include U.S. News and World Report Best Global University Ranking? Other rankings?) and dropped the proposal on having this explained to me! Robminchin (talk) 16:08, 25 September 2018 (UTC)Reply
That is interesting that you had also proposed adding the US News ranking to the table as an option. Mikecurry1 (talk) 14:59, 28 September 2018 (UTC)
As I said, that was based on my opinion, and my misapprehension that the rankings here were being decided by consensus of the opinions of editors as to which were the most important. Having which rankings to include decided by reference to external experts is far more in keeping with the Wikipedia philosophy. Personally, I think US News is a better ranking than ARWU, but not as good as the other three on here (and QS obviously has the best survey, having included me in it :) ), but I have to try to ignore my personal opinions when editing Wikipedia and follow external experts to create an objective encyclopedia. Robminchin (talk) 16:12, 28 September 2018 (UTC)Reply
I also agree with your opinion that the US News ranking is a better ranking than the ARWU. In terms of your opinion being better or worse than external experts, I have no clue why that Swiss website is the arbiter of what should appear on a UK wiki ranking template; it gives this website way too much power. Perhaps you and I and others on this wiki page have more expertise. When I was reviewing that website further, I learned that it shows far more than only those four rankings, [31][32] showing how Swiss universities rank on global publications like Newsweek. Moreover, information on this website, such as Newsweek as a global ranking, is outdated: "It is a mix of Times and 50 % of Shanghai indicators, 40% THES and 10% from library volumes, Type: Research oriented - Size: top 100 - Region: World." Newsweek only made rankings from 2006 to 2012, before these other major rankings gained their current influence, and I cannot even find a 2018 global Newsweek ranking online. We have no clue in what capacity the Swiss government uses this website, or to what extent decisions are formed off it. This is a poor basis for deciding what appears on a UK university template; on closer inspection the site is out of date, and it simply shows how various Swiss universities rank on many popular rankings reported and used heavily by people and the media. The website should clearly not be the arbiter of what is presented on this wiki. Mikecurry1 (talk) 05:31, 30 September 2018 (UTC)
We don't follow our opinions rather than those of external experts because that would be antithetical to the basic principles of Wikipedia. If you don't understand, after all that has been said here, why we are following a website by government-backed experts, which clearly identifies four main global rankings, then that's nobody's problem but yours. Robminchin (talk) 17:00, 30 September 2018 (UTC)Reply
I can agree that it is okay to use government sources, as long as the criterion is applied across various government sites and not only this Swiss website as the arbiter of what is included on a wiki template. I do not have government websites for the US News and World Report global rankings, so if this is the government criterion we are using, I am okay with not including the US News global rankings, even though we are both of the opinion that it is better than the ARWU. It is important to take a critical eye to that Swiss government website, which is far from the ideal you have held my government sources to, as well as very outdated. I have previously used sources including the French government, the Singaporean government, the World Economic Forum, the European Commission website, the World Government Summit (which sets policy for the United Nations), and others. I am now attaching a link to the Hungarian government's National Research, Development and Innovation Office, which I think will be acceptable to you. If your criterion is use by government offices, then I will not have a problem, as long as a double standard is not held that allows this Swiss website but not other government office sources. I hope you can keep an open mind towards the Hungarian government's National Research, Development and Innovation Office.[33] Brief description above. Mikecurry1 (talk) 03:14, 1 October 2018 (UTC)
That is a news item from a government office, not something remotely comparable to the Swiss site – a government run site with an expert-selected set of main rankings. Your previous efforts included multiple articles that weren't even about the ranking in question and documents posted on public forums, yet you persist in claiming these as "sources". You also claim the Swiss site is "very outdated" without any evidence whatsoever (actually it has just been updated with the THE 2019 rankings). You clearly have no evidence to offer and are wasting our time posting whatever links Google throws up. Robminchin (talk) 05:30, 1 October 2018 (UTC)Reply
(Just FYI, the reason I wrote that the website is outdated is its reporting of Newsweek's global ranking on its results page,[34][35] a ranking that no longer exists. So there is no reason to write that I had no evidence for this, or to discount my evidence.)
Robminchin, no one needs your approval as to what counts as evidence on wiki. I am not sure if you have dealt with a grumpy peer reviewer, but that is what you feel like to me right now. I have been trying to fit into your system by following your critiques, such as creating a new wiki page for the Reuters ranking, as that was part of your criticism. I have also tried to fit into your system of using government resources for determining rankings, but this has been impossible. I have checked Wikipedia principles, and nowhere does Wikipedia say that government-backed resources should be the deciding factor for what is neutral, nor is this how other ranking templates have decided. The idea that you personally need to approve which government sources are appropriate creates bias, as you are accepting one Swiss .gov site while rejecting the numerous others I have cited, including France, Hungary, the World Economic Summit, the World Government Summit, the World Economic Forum, etc. No one has agreed that the Swiss site should be the arbiter of the rankings on this page. Rather, Wikipedia recommends, per its core principle WP:5P4, seeking consensus. Consensus has been a core principle of Wikipedia for forming decisions, therefore I have taken the community's opinions from this page as to what are the best rankings to use and summarized them into a table below. That way everyone on this page gets a vote as to which rankings to use, rather than you or I deciding what is most appropriate based on your and EmyRussell's Swiss ranking site. I tried to use government sources per your idea, but that has been impossible, as you are not accepting any evidence other than what supports your own preference for the Leiden rankings. So consensus is the wiki principle to use in this case, per WP:5P4, and even you had advocated for me to seek consensus too. Anyone can add their opinion then. Mikecurry1 (talk) 01:24, 2 October 2018 (UTC)
You are misconstruing what I said. You need reliable expert sources to back up the claim that something is a major ranking. You provided press releases. To go through the "evidence" you claim to have presented, so others can judge for themselves:
* France: A press release and a promotional brochure, neither of which mentions the ranking in question.
* Hungary: A link to a press release by the Belgian university that came top of the rankings.
* World Economic Summit: Never mentioned before now, no evidence offered.
* World Government Summit: An article produced by Thomson Reuters – not a third-party source, and it carries a disclaimer that it is not the opinion of the organisers of the summit.
* World Economic Forum: An article "published in collaboration with Reuters" – not a third party source
You have also previously claimed that the ranking is used by other governmental bodies:
* European Commission: based on a PDF from Thomson Reuters about the ranking appearing on a public forum on their website. No evidence presented that the EC is actually using this ranking and the document is not a third party source.
* Singaporean government: a press release on the results of one of their universities.
* Texas: no evidence offered
* Swiss State Secretariat for Education: a news story from the international service of the Swiss Broadcasting Corporation, nothing to do with the secretariat in question.
In summary, you have offered zero independent expert assessments of the Innovation Ranking as a major ranking. That is why you are not getting anywhere. Robminchin (talk) 03:08, 2 October 2018 (UTC)Reply
These were major government sources and policy sources that found this ranking useful enough to publish on it. That is the same as what the one Swiss page did. We have no additional information anywhere on any of these sites about how these rankings are used, including the Swiss site. Here is the board for Hungary's office. [36][37] Here is a government website from Texas where they reference this ranking as a useful research tool, [38] as well as a presentation by a university to the city of Seattle citing this ranking. [39] From Wikipedia:Verifiability, on reliable source neutrality: "Sources themselves do not need to maintain a neutral point of view. Indeed, many reliable sources are not neutral."
They are not "major government sources", they are press releases and links to releases. They are certainly not expert opinion. With reference to the articles by Reuters, not being a third-party source is completely different from not being neutral: these sources cannot be used to establish the importance of the ranking. Robminchin (talk) 05:24, 2 October 2018 (UTC)Reply
If my many government sources are not acceptable, then yours should not be either for the Leiden ranking. They are the same: just websites from government sources showing rankings. That's all they are. Mikecurry1 (talk) 06:05, 2 October 2018 (UTC)
No, a government website with an academic advisory board is not the same as a government press release. Robminchin (talk) 06:31, 2 October 2018 (UTC)Reply
Like the Hungary one.Mikecurry1 (talk) 07:32, 2 October 2018 (UTC)Reply
All you have offered from that site is a "news and events" item with a link to a press release. There is no implication of expert assessment saying that this is a major ranking. Robminchin (talk) 15:49, 2 October 2018 (UTC)Reply

Change design


I much prefer the design of Template:Infobox US university ranking, as seen for instance here: University of California, Berkeley#Rankings and reputation. It looks a lot better and has a nice national/global split which makes it easier to read. I believe we should re-design the UK template so that it looks similar; does anyone agree? Epi100 (talk) 16:45, 2 April 2014 (UTC)

Seconded! AntiqueReader (talk) 07:40, 4 April 2014 (UTC)Reply
Good to hear! I would attempt to edit the template myself but it seems a bit out of my league. Is there anyone else who could give it a try? Epi100 (talk) 15:44, 4 April 2014 (UTC)
I'm happy to give it a go. Well, as in I'm going to copy the American one. xD AntiqueReader (talk) 09:04, 5 April 2014 (UTC)Reply
Sounds good :D Epi100 (talk) 14:28, 5 April 2014 (UTC)Reply
I agree too; the current design is overcomplicated. It could be much simpler, representing the four major global rankings: THE, QS, ARWU, and the US News world rankings.
Makes sense, I can give it a go to adjust the design; you are all right that it needs design changes. Simplicity is best. Per the design suggestions in this talk section I will try to adjust per everyone's suggestions; let me know what you think.

I agree that the way the US ranking template presents the ideas is better, but I wonder why only ARWU was included in the national category and not QS and Times as well. By the way, I just added the World Reputation Rankings (national part) at the bottom of the world part, but maybe it should be moved together with the corresponding ARWU, QS and THE. Biomedicinal (talk)

I don't quite understand what you mean by only ARWU being included? All three global rankings are now split into both global and national parts. I personally don't like the national part of the global rankings anyway and think they should be removed. Epi100 (talk) 12:54, 14 May 2014 (UTC)
I agree a national ranking based off the global rankings is pretty useless, and there is a national section for this. I do think, though, that as global rankings are dominated by American universities, it is very relevant to compare universities across Europe, and these global rankings could be used as a way to do so: excellent universities may do poorly on global rankings against American universities but be top-notch universities in Europe, and should be ranked accordingly. The UK rankings should reflect how these universities rank against other European universities too. An example would be UCL, which is ranked only 22nd globally but 4th in Europe. The numbers could be side by side: 22nd globally, 4th in Europe. — Preceding unsigned comment added by 2605:E000:6003:5900:608A:AB1D:A7CA:686E (talk) 15:57, 29 July 2015 (UTC)
Um... I don't think it's necessary. In fact, people have been creating templates for institutions in their own countries to replace regional or even continental ones, as each country runs a different education system. Such country-based templates are more specific and directly useful for their nationals to compare the performance of selected universities. Overall and national ranks are quite enough in my opinion. Full lists are always available on the pages in the citations if they want to know more. — Preceding unsigned comment added by 14.136.68.165 (talk) 16:42, 3 August 2015 (UTC)
I agree a new design could probably improve on the current design.Mikecurry1 (talk) 09:16, 2 November 2017 (UTC)Reply

Per the Discussion Change design Above


Per the discussion and recommendation by users Epi100, AntiqueReader, Biomedicinal, and an IP editor that this template be brought more in line with Template:Infobox US university ranking, I have made a few design edits. The headings already state Global or National; each university ranking does not need to state global or national below it a second time, as that is redundant. Also, per this talk page's discussion, I have included the US News rankings. I think the text font size could be increased or adjusted to be more in line with Template:Infobox US university ranking, or otherwise made more aesthetically pleasing. Please do improve or adjust this design edit further. Mikecurry1 (talk) 21:07, 24 September 2018 (UTC)

Please start new discussions at the foot of the talk page. Robminchin (talk) 03:09, 25 September 2018 (UTC)Reply
Good idea Robminchin! Moved this to the bottom of the page, on implementing the design change discussed above. Mikecurry1 (talk) 18:16, 25 September 2018 (UTC)
Mikecurry I do not appreciate how you have changed the design templates for this page without creating a discussion on this page. Citing discussions on the US university ranking template is irrelevant. This talk page discussion also voted against the inclusion of the US News ranking. EmyRussell (talk) 14:00, 27 September 2018 (UTC)Reply
Second what EmyRussell is saying. When making changes to the content of a highly visible template it is best to discuss here first. And also we don't need the template to be inline with the US template, they are separate templates for a reason. Aloneinthewild (talk) 18:52, 27 September 2018 (UTC)Reply
Sorry EmyRussell, Aloneinthewild. This change referenced a discussion higher up on this page, titled "Change design", which I could have done a better job of linking to. I had alerted several of the authors from that discussion so they could chime in; there was unanimous consensus there for an improved design. I had originally posted this directly below that discussion, and moved it to the bottom of the page per Robminchin's suggestion to start new sections at the foot of the talk page. My design changes were minor: some things in the last design were clearly redundant, like repeating "global" or "national" four times, so I think this is an improvement. A design change was originally requested by several users with consensus, and I was trying to start a discussion about implementing their idea. So I hope that eases any hard feelings. Mikecurry1 (talk) 05:35, 28 September 2018 (UTC)
I for one greatly prefer the new design, and thanks to Mikecurry1 for making the change. As you say, 1) there was plenty of support to improve/change the design (albeit from a while ago) further up in this talk section, and 2) the change is minor and aims to avoid repetition and present the information in a clearer way. The old design was both ugly and confusing, in my view. Kiki 233 (talk) 08:53, 28 September 2018 (UTC)
Yes, thank you Kiki 233. Glad you like the minor improvements of removing the repetitive text. I could have linked this discussion better to the "Change Design" section at the top of the page, so people could see the change was well supported, to avoid ill will from a design change. Hopefully you all have ideas on how to improve the design too. Mikecurry1 (talk) 14:48, 28 September 2018 (UTC)
For future reference, and this applies anywhere on Wikipedia, if you see "unanimous consensus" for a change in a years-old discussion and that change isn't reflected in the article, it means one of two things:
1. There wasn't actually a consensus – either you're misreading it, or the discussion switched to a different thread (this is quite common).
2. There was consensus, but it has since changed to something different.
Either way, whatever consensus you think you see in an old discussion doesn't matter. You need to go back to the talk page and get a new consensus for whatever change is proposed. You can reference the earlier discussion as being the consensus reached at that time (at which point people may counter-reference other threads), but you can't cite it as a current consensus. You might also want to tag recent participants on the talk page and on the article, rather than those who were active at the time of the previous discussion. Robminchin (talk) 02:22, 29 September 2018 (UTC)
No hard feelings were meant; apologies again for any. It can be harder to edit when people are not all in the same room. Mikecurry1 (talk) 18:37, 29 September 2018 (UTC)
I also greatly prefer the minor new design changes Mikecurry1 made. Not sure why there were some reversions, so I put one or two back. The reversions made the design uglier again. Kiki233 was right: 1) there was plenty of support to improve/change the design in this talk page section, and 2) the change presents the information in a simpler way. The old design was confusing. 75.83.33.227 (talk) 04:47, 18 October 2018 (UTC)
IP, I reverted your edits. You should read Robminchin's comment above. Clearly, I am against this. I see the years as necessary information; I'm not sure how it is ugly or confusing. NB: It's interesting that Kiki has made only one other edit this year, and then chooses to add to this discussion. Aloneinthewild (talk) 20:54, 18 October 2018 (UTC)
Regarding your reverts of the IP edit, your point about including the year as necessary information is important. I never meant that using a year makes it ugly or confusing; perhaps it can and should be incorporated. What I meant was that the year you want can be incorporated on the side, without reverting design updates others liked. I personally thought the year might be redundant as it is already in the citation, which is why other world ranking templates do not have the year. I think people thought the old design was ugly and confusing, but that had nothing to do with including the year or not. I did not think Kiki233 commented on anything related to the year, while I may have. Yes, the year can and should be included. 75.83.33.227

Ok, I can agree with that. Again, it's always good to come and discuss the issue first before making changes that are likely to be reverted. Aloneinthewild (talk) 10:52, 3 November 2018 (UTC)

I do not agree with the proposed new changes. I am also wary of the 'different' IP address users who seem to be connected by their interest in Imperial College London and their repeated desire to change the design on this page. EmyRussell (talk) 23:10, 9 November 2018 (UTC)
That is the same IP user, 75.83.33.227. The last edit showed two different IPs, though I edited twice from the same computer. I am very concerned that EmyRussell and Robminchin have control over this wiki page, even though there was a consensus by four users (Kiki233, Mikecurry1, Aloneinthewild, and myself), as well as plenty of support in the design change section from people who like or can agree with this. There is strong consensus for this from the "Change Design" section, so to claim there is no consensus is wrong. I also believe they have no control over which rankings to use either; there are other users on this page with valid opinions. Kiki233 wrote, "1) there was plenty of support to improve/change the design," so reverting is a complete rejection of the opinions of others. I moved the "Change Design" section to just above this one, so you can also reference the many other people who were not satisfied with the design and wanted a change but have yet to chime in here. Please think of changes to improve the design rather than simply reverting without explanation beyond the word "consensus", when there has been a clear consensus by four users, and many more from before this year as well. 2605:E000:141B:CACB:4CAA:1A86:DE63:3282 (talk) 00:27, 12 November 2018 (UTC)
I appear to have been dragged into this by unwarranted accusations, despite having stayed out of the current dispute, so I find myself forced to give an opinion.
Firstly, the current dispute appears to be about where the year should go in the template. This was not discussed previously, so any claim as to consensus for a general design change does not apply to this specific discussion of what the design should be. There was consensus in the discussion between Aloneinthewild (talk · contribs) and the IP editor (which I agreed with, but which did not seem worth commenting on as it was already a consensus) to include the year, but that was all. The format in which the year should be included was not discussed.
Secondly, the links are to the articles about the rankings, not about the rankings in a specific year. My interpretation of the Manual of Style's instruction that "The article linked to should correspond to the term showing as the link as closely as possible given the context" (MOS:SPECIFICLINK) is that this means the year, not being part of the article linked to, should not be in the link.
Thirdly, there doesn't seem to be specific guidance in the MoS as to whether to put the year in parentheses. This is, however, the established style in this page, so should not, as per various ArbCom rulings, be changed unless there are substantial reasons for the change.
Taking all of these together, keeping the years as they are currently – in parentheses, outside of the wikilink – seems the best course of action. Robminchin (talk) 20:05, 12 November 2018 (UTC)
(Firstly) Glad you agreed with including the year, per my discussion with Aloneinthewild. I also appreciate that you are able to see the community wanted changes to the design. (Secondly) I removed the year from the corresponding link, as you suggested. (Thirdly) I put the year in parentheses, as you preferred. As I discussed with Aloneinthewild, keeping the year on the side makes the template simpler, and several of us liked MikeCurry1's simpler one-line design. I have incorporated your suggestions, Robminchin, along with the design changes the community wanted, so I hope this is a solution where everyone gets what they want: the design updates, plus the year in parentheses and outside the link. Per the discussion right below, it appears you were also in agreement with MikeCurry1 about the year not being in the link. So I hope everyone can win in this way and has their points agreed with. 75.83.33.227 (talk) 05:01, 23 November 2018 (UTC)
Glad we could come to agreement. I've gone ahead and moved the years back out of the links and updated the documentation on the reference names. Robminchin (talk) 06:30, 23 November 2018 (UTC)
I also agree with these changes (as well as Robminchin's good ideas) as an improvement on the previous design, and I'm glad Robminchin was open to working out any differences. If people feel further changes to the design would be good, I'm happy to consider implementing those too. Mikecurry1 (talk) 20:54, 23 November 2018 (UTC)
I like these changes, including Robminchin's last edit to make the year black. I had another idea, depending on what you think. Since the year now sits right next to the ranking and looks cleaner after Robminchin's last edit, what if we had the ARWU link go directly to the ARWU webpage rather than to another wiki page? Then we could remove the reference so there is less reference text per line. This would look cleaner also; what do you think? It shows the year next to the ranking now, so I think it could look good. Thought I would get your opinions instead of implementing a design change directly, to allow for more opinions. Best, Mikecurry1 (talk) 21:11, 23 November 2018 (UTC)
Wikipedia is generally against inline external links (WP:EXT), and we need the wikipage links so people can click through and find out details of the ranking on Wikipedia (which is more neutral and better practice than relying on information on the rankings' websites). It's probably best to keep the external links as references. Robminchin (talk) 00:24, 24 November 2018 (UTC)
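A sketch of the two approaches being compared, in wikitext. The row markup, reference name, and URL here are illustrative only, not the template's actual code:

```wikitext
<!-- Preferred: wikilink to the ranking's Wikipedia article; the external link stays inside the reference -->
<th>[[Academic Ranking of World Universities|ARWU]] (2018)</th>
<td>101–150<ref name="Academic Ranking of World Universities">{{cite web |url=http://www.shanghairanking.com |title=ARWU}}</ref></td>

<!-- Avoided per WP:EXT: inline external link in the infobox body -->
<th>[http://www.shanghairanking.com ARWU] (2018)</th>
<td>101–150</td>
```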
Nice edits, Zackmann08! Hope you are not opposed to every idea I suggest, Robminchin; your reasoning seems solid here though. It is looking much better than before with these new design changes. Thanks, everyone! Mikecurry1 (talk) 20:22, 6 December 2018 (UTC)

Remove year from reference names?

At the moment the reference names have the year in them, e.g. <ref name="Academic Ranking of World Universities 2018">. This changes every year, meaning anywhere else in an article that used this reference name rather than repeating the reference would have to be updated. It would be simpler if we just used, to continue the above example, <ref name="Academic Ranking of World Universities">, which could be left unchanged, making it easier to use these references in articles. What do people think? Robminchin (talk) 02:57, 29 September 2018 (UTC)
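A sketch of the two naming schemes in wikitext. The citation contents and URL are placeholders, not the template's actual markup:

```wikitext
<!-- Year in the name: the template AND every article reusing the reference must be edited each year -->
<ref name="Academic Ranking of World Universities 2018">{{cite web |url=http://www.shanghairanking.com |title=ARWU 2018}}</ref>
<!-- reused elsewhere in the article as: -->
<ref name="Academic Ranking of World Universities 2018" />

<!-- Year-free name: the citation body is still updated annually, but reuses keep working unchanged -->
<ref name="Academic Ranking of World Universities">{{cite web |url=http://www.shanghairanking.com |title=ARWU 2018}}</ref>
<!-- reused elsewhere in the article as: -->
<ref name="Academic Ranking of World Universities" />
```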

That's a good idea, Robminchin! Yes, updating the references everywhere each year is an unneeded hassle. Go for it. Mikecurry1 (talk) 18:14, 29 September 2018 (UTC)

Which Rankings to Use By Consensus

There has been a disagreement over which sources to use, with a single Swiss website acting as the arbiter of what to include on this template. An IP user wrote that the past approach "is the worst of all worlds." Wikipedia's five pillars recommend seeking consensus instead (WP:5P4). That way everyone has a say, with respect and dignity, in which university rankings to include, including those who prefer only the rankings listed on the Swiss website.

Please add your name or adjust your opinion; rankings can be determined by what is on this table. I have tried to tag everyone, per Robminchin's idea, so that everyone is included in forming a new consensus.

@Robminchin, EmyRussell, Aloneinthewild, Biomedicinal, Kiki 233, Epi100, AntiqueReader, TSP, StarryGrandma, Derek328, DBailey635, Banaticus, and Leschnei:

University Ranking | People For It | People Against It | No Opinion or Evaluate Later
QS | Robminchin, MikeCurry1, EmyRussell, 88.98.200.46 (preferred only QS, THE, ARWU) | Against Here |
THE | Robminchin, MikeCurry1, EmyRussell, 88.98.200.46 | Against Here |
ARWU | Robminchin, MikeCurry1, EmyRussell, 88.98.200.46 | Against Here |
Leiden | EmyRussell, RobMinchin, AloneintheWild (slightly supportive, needs additional information) | BioMedicinal, MikeCurry1, 2605:E000:6003:5900:DDFF:A270:DEF:B4C4, 88.98.200.46 |
US News | BioMedicinal, MikeCurry1 | Robminchin, EmyRussell |
Reuters Innovation | MikeCurry1 | Robminchin (So far), EmyRussell (So far) |
Teaching Excellence Framework | Robminchin, EmyRussell, MikeCurry1 | Against Here |

Mikecurry1 (talk) 02:07, 2 October 2018 (UTC)


I have deleted a table that blatantly put words in other people's mouths. Follow the established method of such discussions on Wikipedia.
Oppose this concept of voting on what to include as being contrary to Wikipedia policy. See WP:NOTDEMOCRACY and Wikipedia:Polling is not a substitute for discussion. We should stay with requiring assessment as a major ranking by independent external experts for inclusion in this template. Robminchin (talk) 03:15, 2 October 2018 (UTC)
I have restored my table for understanding and building consensus. WP:Content Removal WP:INAPPROPRIATE. It was in part to settle our disagreement, where you have been asserting that the only acceptable evidence is a Swiss university guide page (which you like), making yourself the arbiter of which external independent sources are approved and which are not. This method negated my opinion, and possibly others'. I am open to discussion as well. I have started a poll, which is fine per your link: polling is not a substitute for discussion. I did not blatantly put words in people's mouths; I summarized the opinions of people on this page into a table, which is useful. These were almost entirely clear statements of preference, including your opposition to certain rankings. Moreover, there are so many talk points above that it is hard to tell who thinks which rankings should be included; the table is a summary. I said people can change their opinions if they do not feel they are accurate, and pinged them (per your excellent idea), so we can govern this by consensus per wiki policy. Still open to discussion as well, forgetting about any of our disagreement. Perhaps it is better to get everyone's opinion on the best university rankings to use to build a WP:Consensus. Mikecurry1 (talk) 03:58, 2 October 2018 (UTC)
I removed the table because I felt it fell under the category of removing harmful posts. The table claims to give the current opinions of people who have not participated in this thread or even in recent discussions on this page. Inclusion of "unverifiable speculation" is, in fact, the first item under WP:INAPPROPRIATE (although that is actually about article pages, not talk space). "Misrepresentation of other people" is also listed under Behavior that is unacceptable on talk pages. I recognise that removing such material is contentious, so I will respect that you have decided to restore it.
I also object to the mischaracterisation in the introduction to this thread of what has been said on this page. I have consistently argued that we should follow the opinion of third party experts as to what constitutes a major ranking, not that we should follow a single website. While this is not a formal RfC, the introduction offered here is far from being the recommended "brief, neutral statement of or question about the issue".
Similarly, not wishing to put press releases in the same category of reliable sources as independent expert opinion is not "lack of accepting independent external evidence"; it is entirely in keeping with WP:RELIABLE. This is what I have argued for throughout. Robminchin (talk) 05:05, 2 October 2018 (UTC)
I was not trying to misrepresent anyone. I was trying to represent everyone's current opinions, gleaned from the above conversation, as honestly as I could in a summary. I think it is pretty close to what people have stated on this board, honestly. For example, your current opinion on US News was that you do not want it included because there is no government-backed source; you like the ranking, but currently vote against including it for that reason, so I put you as a No in that column. Others such as Biomedicinal have suggested including that ranking, so I put him as a Yes, as it was pretty clear he liked it previously. It is simply a summary; people can edit it.
If you want to edit the introduction to this thread for the poll to make it more acceptable to both of us, I am fine with that. I was noting our disagreement as the reason for starting it, but that may not be needed if you do not want it. We could possibly have a poll in a new thread for building consensus, without this introductory disagreement? Perhaps that is better, and we erase this thread. Mikecurry1 (talk) 05:39, 2 October 2018 (UTC)
@Robminchin: I have created a duplicate of this thread without the introduction. If you feel that is better, you can erase this top thread, "Which Rankings to Use By Consensus", if preferred. (Please erase one of the two threads, so only one is present on the page.) There is nothing bad about a summary of people's opinions that we can build consensus from. This page has too much text generally to see what is happening anyway. Mikecurry1 (talk) 05:49, 2 October 2018 (UTC)
The problem is that you shouldn't be "trying to represent everyone honestly as best [you] could". You may be trying to do this as honestly as possible, but you don't know (and can't know) their current opinions. You don't know how they might be swayed by more recent discussions, or even if they care anymore. For those who have participated recently, the guideline is to "[be] precise in quoting others". If you want to have a poll, you should follow standard Wikipedia practice, where people give their own opinions or decline to participate, not make up a table where you include your interpretation of every opinion given at any point on this page and invite people to change it. You are ascribing opinions to people without any evidence that they currently hold those opinions.
Furthermore, as it says at WP:POLL, "A "vote" that doesn't seem to be based on a reasonable rationale may be completely ignored or receive little consideration, or may be escalated to wider attention if it appears to have been treated as a simple vote count. It is important therefore to also explain why you are voting the way you are." Without active participation in the discussion, or at the very least actively stating that they agree with someone who is participating, those entries in the table are completely meaningless. Robminchin (talk) 06:28, 2 October 2018 (UTC)
Asking for help from other users to resolve our conflict. Per dispute resolution. Thanks. Mikecurry1 (talk) 07:11, 2 October 2018 (UTC)
@Mikecurry1: Your table has accurately represented my opinion on this matter. However, I, for similar reasons to Robminchin, do not believe that it is in Wikipedia's neutral interest to choose which ranking certain users would prefer. Reading through this talk page, it seems you have an agenda to push. EmyRussell (talk) 21:40, 4 October 2018 (UTC)
Hi @EmyRussell:, I know you proposed that initial Swiss university site as the ranking source, and thus believe it should be used. I was open to this initially, and still could be. At the same time, I did not feel it was fair that this website has been the determiner of which global rankings to include on this wiki. I had proposed two other rankings, the US News and Reuters Innovation rankings (others had proposed these before too), and was open to either of them being considered. It felt like my opinions and thoughts were not treated as valid, and that others were deciding which evidence counted in support of their own agendas. It did not seem fair that the Swiss ranking site, which is outdated, determines which rankings to have on this wiki. When I tried to use the criterion of a government-backed source (my sources were all legitimate .gov sources), they were not acceptable. As a scientist I thought this bias was unfair to me. No one has determined that government-backed sources should be the criterion for which rankings to place on the template. I was open to government sources being used, but not when others are determining which evidence is considered legitimate. I never thought the Leiden rankings were a good ranking personally (per your and Robminchin's agenda), or that there was full consensus around them. I had agreed with others about the problems with that ranking as it relates to humanities universities (raised by Biomedicinal), as well as its problems with the reliability of its rankings between years, its reliability compared with other rankings, and its biases towards large institutions. An IP user wrote, "Worth noting also that Robminchin's preferred PP(top 10%) measurement places Rockefeller University ahead of MIT, Harvard and Stanford, puts Rice University ahead of Yale and Columbia, and the University of Exeter ahead of John Hopkins. In short, it is of highly questionable quality as a ranking leaving aside the above." 
I was open to Leiden being used anyway as an option for universities to include on their wiki pages (as you proposed it and you both liked it), if we were also including other popular rankings as originally discussed. It seems other popular rankings are not being considered legitimately, however, but judged against this one Swiss site. I personally feel the Reuters Innovation ranking adds a new dimension of innovation and basic/applied research societal impact not found in the other rankings, and is thus the most useful additional metric to consider. As the Vice Chancellor of Cardiff University states, "Innovation is central to its university strategy... We continue to build partnerships with industry, generate spin-outs, nurture academic enterprise and support student start-ups."[40]. I think more students are beginning to evaluate which universities to attend based on innovation (such as Stanford surpassing Harvard in some rankings now), with innovation becoming a part of university strategies too. I also think that while Reuters' methods are objective, it has generally included many UK universities, with the ranking based in the UK. At the same time I am open to the US News global ranking too. My agenda was that other major popular sources should be considered as well, and they do not seem like they are really being considered legitimately, but judged off this Swiss site. Mikecurry1 (talk)
I have carried out a literature survey to see what rankings are actually being used by those working in this field, which should address your concerns about relying on a single website; results are listed in a new thread, but basically support the concept that Shanghai, THE, QS and Leiden (with the possible addition of NTU) are the main rankings. Choosing rankings on whether they match our preconceived notions ("PP(top 10%) measurement places Rockefeller University ahead of MIT, Harvard and Stanford, puts Rice University ahead of Yale and Columbia, and the University of Exeter ahead of John Hopkins") is not a valid way to proceed. I think I noted before that Shanghai treats humanities much worse than Leiden: my "smell test" (which I do not recommend as a criterion for inclusion) is how they rank the LSE: 26 in THE, 38 in QS, 53 in Leiden, 151–200 in Shanghai, 244= in US News, 701–750 in NTU. That Leiden treats specialist humanities institutions badly is not a valid criticism (at least not now; it is quite possible their methodology has improved), and is not a good criterion for inclusion in any case. Robminchin (talk) 23:43, 7 October 2018 (UTC)
I like your idea for a literature review and think that can be fair. Some of these rankings, such as US News Global and Reuters Global, are newer, so a literature review restricted to this year's studies could include them as possible candidates too. These rankings would not appear in older studies, which mirrors my criticism of the Swiss website: that it is outdated. Sometimes science takes time to update its methodology too, because it takes time to publish a study. We could probably do this next year as well to see what the current influential rankings are in the scientific literature. Mikecurry1 (talk) 01:58, 8 October 2018 (UTC)


Usage of international rankings in the academic literature

I've been investigating what rankings are used in the literature and other academic uses. I started out with a Google Scholar search for "university ranking" and identified papers that used two or more identifiable international rankings (this normally meant mentioning them in the abstract, unless the paper had been placed in an open repository). Many of the papers were from Scientometrics, so I did a search within that journal as well, and followed some citation trails. Overall, I found 15 papers meeting these criteria (with a few more I might be able to access from work) and one website of "learning resources". Adding in the universityrankings.ch website, this gives 17 expert sources.

I then counted up how often each ranking was used. This showed three groups. The first group consists of the Shanghai, THE and QS rankings, which are used by virtually everyone who wants to compare international rankings. The second group is two bibliometric rankings, Leiden and National Taiwan University (NTU); these are used in around half of the sources, with the Leiden ranking the more popular of the two (used in more than half of the sources). The third group is the other rankings, which are seldom if ever used.

Removing the two websites and two sources that use only two rankings leaves 13 papers, with the groupings remaining unchanged except that the NTU ranking joins Leiden in being used by over half of the sources.

Based on this, the Shanghai, QS and THE rankings should definitely be included (I don't think this was under debate) and there is a strong argument for including the Leiden ranking. If a fifth ranking is to be added, it should be the NTU ranking rather than US News; no other ranking is used significantly in the literature.

DATA:

The International Rankings Expert Group (IREG) Inventory on International Rankings lists 17 general global rankings [41] – not used in adding up rankings, this is simply a list of all sensible rankings

OECD Economic Surveys: Norway (2016) – THE and Shanghai [42]

universityrankings.ch – THE, QS, Shanghai, Leiden listed as “main rankings”

Research and Publishing Support: University Rankings (Singapore Management University Research Guides) – QS, THE, Shanghai, Leiden “This page provides a select list of ranking services. However, the listing of items on this page in no way constitutes an endorsement of a ranking service by SMU Libraries or by the Singapore Management University.” [43]

Correlation among top 100 universities in the major six global rankings: policy implications (Scientometrics, 2016) – Shanghai, QS, THE, US News, NTU, URAP [44]

A comparative analysis of global and national university ranking systems (Scientometrics, 2015) – ARWU, HEEACT/NTU, Leiden, SCImago, QS, THE, URAP, Webometrics [45]; [46]

University Rankings and Social Science (European Journal of Education, 2013) – Shanghai, Leiden, QS, Scopus, THE, U‐Multirank [47]

International ranking systems for universities and institutions: a critical appraisal (BMC Medicine, 2007) – Shanghai, THE/QS [48]

A Review of Outcomes of Seven World University Ranking Systems (Iranian journal of Information Processing & Management, 2012) – THE, Shanghai, QS, 4International (uniRank), Webometrics, HEEACT (NTU), Leiden [49]

A comparative study on world university rankings: a bibliometric survey (Scientometrics, 2012) – ARWU, THE, PRSPWU (NTU), [50]

What the Overall doesn’t tell about world university rankings: examples from ARWU, QSWUR, and THEWUR in 2013 (Journal of Higher Education Policy and Management, 2013) – Shanghai, QS, THE [51]

Comparing university rankings (Scientometrics, 2010) – QS/THE, Shanghai, HEEACT/NTU, Webometrics, Leiden [52]

A critical comparative analysis of five world university rankings (Scientometrics, 2016) – Shanghai, Leiden, THE, QS, U-Multirank [53]

Country-specific determinants of world university rankings (Scientometrics, 2017) – QS, THE, Shanghai [54]

International university rankings: For good or ill? (HEPI Reports, 2016) – QS, THE, Shanghai [55]

Bibliometrics and University Research Rankings Demystified for Librarians (Library and Information Systems, 2014) – Shanghai, QS, THE, Leiden, NTU, SCImago listed as “some of the better-known rankings” [56]

Are university rankings useful to improve research? A systematic review (PLoS ONE, 2018) – Shanghai, CWUR, Leiden, QS, RUR, SCImago, THE, Reuters Innovation, U-Multirank, US News, URAP, Webometrics [57]

One size fits all? A different perspective on university rankings (Journal of Higher Education Policy and Management, 2016) – Shanghai, THE mentioned as the main two in the media, paper discusses these and QS, U-Multirank, Leiden, NTU. [58]

THE: 17
Shanghai: 17
QS: 15
Leiden: 10
NTU: 7
Webometrics: 4
U-Multirank: 3
URAP: 3
SCImago: 3

Robminchin (talk) 23:12, 7 October 2018 (UTC)

Hi Robminchin, I think this is a fair approach, to use a literature review. To address my consistent criticism, though, it needs to cover the current year (a one-year time frame), as results can be outdated depending on what years the meta-analysis covers. I am not sure when these publications were made; US News and Reuters Innovation may not have existed when they were written, but may have gained influence since. Considering wiki editors including yourself, Biomedicinal, and I have all previously proposed including US News in this template, it would be surprising if it was not in any publication. It also takes scientists time to publish a paper and to update their methodology. So the literature review method you have thought up is very fair; if we could just do it for the current year's studies, that would address my criticism. (Also, Google Scholar seems pretty neutral, as you used; perhaps search "world university rankings" filtered to since 2018. I think there should be an "s" after "ranking" in our search, since some of the rankings use the word "ranking", and "world" appears in THE, QS, and ARWU, so this will find our relevant criteria of global or world rankings; we could use "global" instead. "Since 2018" seems important to include too, to account for US News and Reuters possibly not existing earlier.) We could do this for a one-year time frame next year too as a way to think about the rankings neutrally, as it may take additional time for scientists to become informed and update their methods for newer, probably influential global rankings such as US News and Reuters. I am open to your general approach for a literature review, as doing this annually would help with neutrality in deciding on the rankings here. Mikecurry1 (talk) 02:08, 8 October 2018 (UTC)
I read the more recent PLOS ONE paper you cited, as it is a highly regarded peer-reviewed journal in my field. The authors proposed an interesting conclusion I think is highly relevant to this discussion. The introduction wrote: "Concerns about reproducibility and impact of research urge improvement initiatives. Current university ranking systems evaluate and compare universities on measures of academic and research performance. Although often useful for marketing purposes, the value of ranking systems when examining quality and outcomes is unclear. The purpose of this study was to evaluate usefulness of ranking systems and identify opportunities to support research quality and performance improvement." The authors discussed this: "No single ranking system provides a comprehensive evaluation of research and academic quality. Utilizing a combined approach of the Leiden, Thomson Reuters Most Innovative Universities, and the SCImago ranking systems may provide institutions with a more effective feedback for research improvement. Rankings which extensively rely on subjective reputation and "luxury" indicators, such as award winning faculty or alumni who are high ranking executives, are not well suited for academic or research performance improvement initiatives. Future efforts should better explore measurement of the university research performance through comprehensive and standardized indicators. This paper could serve as a general literature citation when one or more of university ranking systems are used in efforts to improve academic prominence and research performance." From this discussion, I think it may be good to consider using both the Leiden and Reuters rankings; they evaluate different things and may be complementary. Mikecurry1 (talk) 05:21, 8 October 2018 (UTC)

Order


Shouldn’t national rankings be above global rankings? I think it makes more sense and would bring it into line with the US university rankings template. 161.23.171.232 (talk) 13:12, 4 December 2018 (UTC)Reply

Either way around can make sense and I have no strong feelings either way, although national rankings do tend to have richer data and thus be more reliable, which could be an argument for putting them first. Note that while putting national rankings first would bring this template into line with the US template, it would bring it out of line with the Australian, Canadian and Indian templates, so this isn't really a good argument for a change. Robminchin (talk) 18:14, 4 December 2018 (UTC)Reply
Following on from an edit regarding the order: could editors please gain consensus before making changes? I'm wondering if this page needs protection to avoid edit warring by IPs that do not engage in discussion. Aloneinthewild (talk) 22:40, 13 January 2019 (UTC)Reply
I'm inclined to agree with moving to National first. My impression is that national rankings are reasonably widely-accepted, as institutions within a country are reasonably comparable; and each institution tends to get broadly similar grades across the various rankings. Global rankings seem to be less well-accepted, as it is harder to compare institutions between very different systems; so they rely on fewer factors than national rankings, and accordingly often give results within each country that are at odds with how the same universities are ranked in national rankings. TSP (talk) 01:27, 14 January 2019 (UTC)Reply
I also agree with putting national above global. ClippednPinned (talk) 11:29, 14 January 2019 (UTC)Reply

That's over a month since this was proposed - there seem to be a few in favour, a few neutral and no real objections - I'd say we should go ahead with this, any last objections? TSP (talk) 17:22, 16 January 2019 (UTC)Reply

I think there are enough opinions to go ahead, and consensus is not against this. Aloneinthewild (talk) 17:27, 16 January 2019 (UTC)Reply
Template reordered. TSP (talk) 11:48, 17 January 2019 (UTC)Reply

Requested move 3 February 2019

The following is a closed discussion of a requested move. Please do not modify it. Subsequent comments should be made in a new section on the talk page. Editors desiring to contest the closing decision should consider a move review after discussing it on the closer's talk page. No further edits should be made to this section.

The result of the move request was: page moved as unopposed. (non-admin closure) Steel1943 (talk) 06:05, 12 February 2019 (UTC)Reply


Template:UK university rankingsTemplate:Infobox UK university rankings – Per Wikipedia:Manual of Style/Infoboxes#Consistency between infoboxes the naming of an infobox should start with the prefix "Infobox"; this is also WP:CONSISTENT with {{Infobox Australian university ranking}}, {{Infobox Japanese university ranking}}, {{Infobox US university ranking}} and {{Infobox world university ranking}}. Gonnym (talk) 09:50, 3 February 2019 (UTC)--Relisting. SITH (talk) 12:53, 10 February 2019 (UTC)Reply


The above discussion is preserved as an archive of a requested move. Please do not modify it. Subsequent comments should be made in a new section on this talk page or in a move review. No further edits should be made to this section.

Rankings to Use By Consensus


Wikipedia's five pillars recommend seeking consensus (WP:5P4). That way, everyone has a say, with respect and dignity, in which university rankings to include.

This table is our best summary of various opinions stated above.

Please add your name or adjust your opinion. I have tried to tag everyone per Robminchin's idea so that everyone is included to form a new consensus on rankings.

@Robminchin, EmyRussell, Aloneinthewild, Biomedicinal, Kiki 233, Epi100, AntiqueReader, TSP, StarryGrandma, Derek328, DBailey635, Banaticus, and Leschnei:

University Ranking | Include | Don't Include | No Opinion or Evaluate Later
QS | 6 users: Robminchin, MikeCurry1, EmyRussell, 88.98.200.46 (preferred only QS, THE, ARWU), Music23123, 76.174.147.202 | Against Here | Leschnei, Banaticus
THE | 6 users: Robminchin, MikeCurry1, EmyRussell, 88.98.200.46, Music23123, 76.174.147.202 | Against Here | Leschnei, Banaticus
ARWU | 6 users: Robminchin, MikeCurry1, EmyRussell, 88.98.200.46, Music23123, 76.174.147.202 | Against Here | Leschnei, Banaticus
Leiden | 2-3 users: EmyRussell, RobMinchin, AloneintheWild (slightly supportive) | 6 users: BioMedicinal, MikeCurry1, 88.98.200.46, 2605:E000:6003:5900:DDFF:A270:DEF:B4C4, Music23123, 76.174.147.202 | 2-3 users: Leschnei, AloneintheWild (wrote previous consensus to include Leiden; wrote "I'm not really sure about the Leiden"; can evaluate and comment later), Banaticus
US News | 3 users: BioMedicinal, MikeCurry1, Music23123 | 2 users: Robminchin (originally proposed it and likes the ranking), EmyRussell | 3 users: Leschnei, Banaticus, 76.174.147.202
Reuters Innovation | 3 users: MikeCurry1, Music23123, 76.174.147.202 | 2 users: Robminchin (so far), EmyRussell (so far) | 2 users: Leschnei, Banaticus
Teaching Excellence Framework | 5 users: Robminchin, EmyRussell, MikeCurry1, Music23123, 76.174.147.202 | Against Here | 2 users: Leschnei, Banaticus

Mikecurry1 (talk) 02:07, 2 October 2018 (UTC)Reply

Sorry to be so unhelpful. My only contribution to this talk page was to ask a question about paywalls back in 2016, when I was a newbie. My vote would be completely uninformed. Leschnei (talk) 11:59, 2 October 2018 (UTC)Reply
Thanks for adding your opinion to this summary table, Leschnei! Mikecurry1 (talk) 02:22, 3 October 2018 (UTC)Reply
Thanks for tagging me but I made my edit to this page as a WikiGnome and don't really have an informed opinion as to what is the most proper source for the information in this template. Thanks again and good luck! :) Banaticus (talk) 23:32, 11 December 2018 (UTC)Reply
I added a vote tally count, and included my vote as well for which rankings to include. Music23123 (talk) 14:07, 15 June 2019 (UTC)Reply
I am also uninformed on all these rankings, so I added my student opinion to the list. From my student perspective, Leiden is very complicated for students. I do not understand why, when I go to Leiden's website, it shows 'p' as the main ranking, with 13 of 25 schools from China; where are Oxford and Cambridge on this list? They should be at the top. I needed to click around to get to 'pp10', and it is hard to find on the site. It seems arbitrary to use 'pp10' rather than how the Leiden website loads. I have heard of the big three: THE, QS, and ARWU. I also like the idea of using specialist rankings such as the Teaching Excellence Framework and a creativity ranking, as I like creativity, but I guess these are specialist rankings compared to overall university rankings. My student opinion is valuable, but my opinion is less informed also. 76.174.147.202 (talk) 06:07, 21 September 2020 (UTC)Reply
ORDER - I have moved the ranking voting table to the bottom of the page. Voting to gain consensus, so everyone's opinion is counted on which rankings to include, is very important. Please vote, as your voice is important. I will request to archive the rest of the talk page so we can start fresh for 2020. 2605:E000:151F:4A6B:8D99:8A3D:BFB7:798A (talk) 21:47, 24 September 2020 (UTC)Reply

Rankings to Use By Consensus - VOTE HERE - 2020


I have started a fresh 2020 vote where everyone can vote for the rankings they prefer to display.

Please add your opinion and vote.

To vote, add your name in either the INCLUDE RANKING (first) column or the DO NOT INCLUDE RANKING (second) column.

@Robminchin, EmyRussell, Aloneinthewild, Biomedicinal, Kiki 233, Epi100, AntiqueReader, TSP, StarryGrandma, Derek328, DBailey635, Banaticus, and Leschnei:

University Ranking | INCLUDE RANKING | DO NOT INCLUDE RANKING
QS | 76.174.147.202 |
THE | 76.174.147.202 |
ARWU | 76.174.147.202 |
Leiden | | 76.174.147.202
US News | |
Reuters Innovation | 76.174.147.202 |
Teaching Excellence Framework | 76.174.147.202 |

76.174.147.202 (talk) 22:20, 24 September 2020 (UTC)Reply

Wikipedia proceeds by discussion and consensus, not by voting (see WP:Consensus and WP:VOTE). Rather than starting a poll, please explain why you think the current consensus from a couple of years ago should be changed and try to achieve consensus for that change. Robminchin (talk) 22:49, 24 September 2020 (UTC)Reply
I would further add that, as can be seen from earlier discussions on this page, the principle followed here has been to identify independent measures of which are the most important global rankings to include rather than to rely on the personal opinions of editors. Doing a literature survey (most recently in 2018) showed that, after the expected top three, the ranking most frequently cited in academic studies was the Leiden ranking. That is why it is currently included and various other ranking are not. Robminchin (talk) 23:00, 24 September 2020 (UTC)Reply
Good point on 1st comment consensus and vote.
Maybe for your second point start there and leave that up and then archive all older materials? That second point is good too and a nice place to start from. I was just trying to get a new consensus and start things fresh.
I think consensus changes over time. Some of the discussion dated to 2008 and so was outdated. My idea was simply to visualize things anew in 2020 with a fresh and current opinion, rather than a vote, thus the archiving of all the older material (which was still accessible in an archive). I agree discussions are necessary and important. As a student, I know you are more of an expert at this, so I was just adding my opinion and hoping others could share theirs, in discussion and visually, for consensus, so we could look at the rankings from a fresh perspective. 76.174.147.202 (talk) 22:20, 24 September 2020 (UTC)Reply
The 2008 discussion is something about switch statements that isn't relevant to discussions on inclusion and could be safely archived. Most of the relevant discussion dates from no earlier than 2015 (from a quick look-over) so it would probably be best to keep it on this page where it can be easily seen and referred to. Many of the points in the old discussion are still valid, and it's useful to see what the earlier arguments were.
To proceed more usefully, I would suggest identifying what are the modifications you would like to see (and don't think your opinion is any less valid because you're a student – I'm not an expert, just an editor with some experience who might be in danger of getting too stuck in my ways) and putting forward a case for those modifications. You should probably also tag people who have been involved in previous discussions but who might not be watching this page. Robminchin (talk) 23:46, 24 September 2020 (UTC)Reply
Yes, most of the earlier discussion before 2015 should be archived so this talk page can renew.
Well, I am not an expert at this. I think it is good to think about what students are using to choose schools. I see other students using mostly the THE to decide on schools, and QS sometimes also. I do not see ARWU being as popular among students for deciding on schools, but it is still part of the big three. I have also not heard of my friends using Leiden to choose a school. I think Reuters is useful for choosing a school, as students like innovation now. What do you think about how Cambridge and Oxford show just the big three rankings (perhaps present the big three)? Or, alternatively, show the big three and let wiki schools choose if they want to include Leiden or Reuters also? Thinking of ideas.
To note: behind Germany with 23 schools, the UK accounts for 21 schools in the Reuters top 100. [59] [60] 76.174.147.202 (talk) 22:20, 24 September 2020 (UTC)Reply
I think one problem would be that we don't actually know what students are using, so this falls back on editor opinions. Another is that students are not the only people who use rankings, something like the Leiden ranking may be used more by research administrators (c.f. its use by the Swiss government) or by academics. The Reuters global ranking only includes 6 UK universities – we don't currently use regional rankings, although this could change. From a usefulness point of view, it appears to be volume-based such that a large university with a moderate rate of innovation would rank ahead of a small university with a high rate of innovation.
Having a standard template of rankings means that schools don't get to cherry-pick the rankings they are best in and allows universities to be compared easily. There's no way of policing this or enforcing it beyond arguing on individual pages, but in general the argument has been widely accepted. It would be quite easy to argue that ARWU is methodologically flawed and so shouldn't be included on the page of university X, for example, particularly when those methodological flaws lead to a low ranking for that university. We want to avoid that kind of situation by defining a common set that should be included. Robminchin (talk) 23:39, 25 September 2020 (UTC)Reply
Yes, I agree with having a standard template of rankings.
One reason for that proposal was that, when I was reading about Leiden in the talk section above, one reason given for Reuters not being included at the time was that it kept the University of London as one school. I think this has changed: individual schools in the University of London are now evaluated — UCL, King's, Manchester, etc. [61] You mentioned that you would like to evaluate it later. I also think, similarly to you, that it is a subject ranking rather than an overall school ranking, so perhaps near the Teaching Excellence Framework at the end? 76.174.147.202 (talk)
I agree with Robminchin. It would seem to me that a single rating system such as Leiden has been used precisely to avoid/discourage cherry-picking a particular ranking system. I don't think anyone has shown a reason why this should change at this time, and I would be inclined to agree with past discussions. Banaticus (talk) 13:59, 28 September 2020 (UTC)Reply
We haven't included any subject rankings here, and I haven't seen any convincing argument as to why we should. What goes in an institution's rankings box is rankings of the whole institution. Robminchin (talk) 04:56, 16 October 2020 (UTC)Reply
That's an interesting point, Robminchin, and a good argument. I agree the IP has offered more of an opinion than an objective argument, so I will do my best to give a more objective reason for why I like Reuters, to discuss content. There are two main reasons why I think Reuters is a good ranking. The first is that the OECD and the European Commission's Directorate-General for Education have said universities should now have a "third mission" as a guiding framework. [62][63] Universities have historically focused on the two missions of "teaching" and "research", which are included in the rankings. A researcher from the University of Edinburgh has explained the third mission as "This widening of scope is reflected in the concept of the ‘entrepreneurial university’ wherein the university is transformed into... engaging in innovation, technology transfer and working with external organizations (Clark, 1998), such that there is an economic and social impact on society at large and the public funds that are used to pay for universities." [[64]][[65]] Example: [66]. Therefore, as we are already addressing the primary missions of teaching and research, also addressing the third mission of economic and social development would make sense, but in the subject rankings below the Teaching Excellence Framework. Reuters Innovation Ranking is one way to address this third mission of universities that we have not included yet. The second reason I like Reuters as a subject ranking is that I also agree with the article from PLOS ONE, which states that adding Reuters Innovation alongside Leiden would provide more effective feedback for institutions and research improvement compared to "luxury" indicators focused on awards. [67] So I think, primarily to address universities' third mission, it could be good as a subject indicator below the TEF.
[68] I hope the universities' third mission is a more content-based and practical reason we could get behind, rather than opinion. [69][70] Music23123 (talk) 20:36, 16 October 2020 (UTC)Reply
While coverage of the third mission is a good idea, the Reuters innovation ranking would be very narrow as a measure, only covering one of the seven aspects (IP and commercialisation) in the Knowledge Exchange Framework. It also has a very limited coverage of UK universities – only six are included (rising to still only 21 in the European ranking). So while the concept of covering the third mission is good, adding a ranking that only addresses a small fraction of the third mission for a very small number of universities (and without, as far as I can tell from their reports, any normalisation for size) does not seem a good way to do this. Robminchin (talk) 21:49, 16 October 2020 (UTC)Reply
Those are good points. I am glad you like and are supportive of the idea of universities' third mission. As far as I am aware, there is one other ranking that can measure the third mission, which is the THE Impact Rankings: [71]. One thought is that we could have an impact section of the rankings (related to universities' third mission of regional economic development and knowledge transfer back to the community): perhaps have THE Impact and Reuters Innovation in an IMPACT RANKING SECTION. I am a big fan of the idea of the THE Impact Rankings, but I think it has problems with its methodology compared to the Reuters ranking. I agree that the Reuters world innovation ranking has limited coverage of UK universities, such that using the world ranking as the main ranking would be bad. I think using the Europe ranking should suffice; it includes as many universities as many other rankings do (for example, the Economist has a limit of 100 universities). Reading the methodology, the European regional ranking is limited to schools that have, I believe, 50 patents published a year. So I agree that not having enough UK universities would not be acceptable for a ranking on the template, but with the European ranking introduced later, that may be solved now, which would make it the superior ranking to analyze for inclusion. In terms of normalization, I agree that is a fair point too. I do wish there were more normalization, but we are not creating the rankings, so we can only use them. I think they say they are using the Web of Science for some of the normalization (Leiden uses the Web of Science for its ranking too). Please check out this interesting paper by a scholar from the University of Manchester regarding the quality of the data in the Reuters ranking. [72] He describes how university rankers are the subject of much criticism, which is understandable. The US News, in my opinion, has the worst normalization of them all.
Yet the scholar from Manchester describes the data set used by Reuters Innovation. In 2010, when THE split from QS, it took Thomson Reuters as its data partner because of the hard, quality data, compared to QS, which used more reputational and subjective measures that placed Ivy League schools higher in the chart, with a skew towards science institutions. "Different rankings are oriented to different audiences over which these rankers want to exercise a certain expertise. The QS rankings, for instance, can generally be described as more (international) student-oriented. The Leiden university rankings are oriented towards readers who are more concerned with research performance measured by bibliometric output. The differences between rankers, their audiences, and the various professionals that support them point to the networked nature of the industry of rankings." "The point on data ownership eventually proved to be a critical one. Although Reuters’ data was the foundation of the industry-income-innovation indicator used in the THE World University Rankings until 2014, in the end, Thomson Reuters produced its own rankings. Thomson Reuters had long gathered patent and other industry-related data, which allowed it to continue thinking about ways to measure, or approximate, how innovative universities were in relation to their peers. The data company used its experience in ranking companies—it had previously compiled industry data and published a report on the world’s most innovative companies (Thomson Reuters 2014)—and applied these approaches to the higher education sector. The Reuters’ ranking focused on the data that was available or linked to academic papers and drew on data compiled by the Intellectual Property & Science business of Thomson Reuters. Reuters made clear that their ranking used their own proprietary data and analysis tools.
The Thomson Reuters sister companies that provided this data included the Thomson Reuters Intellectual Property & Science and several of its research platforms, including InCites, the Web of Science Core Collection, Derwent Innovations Index, Derwent World Patents Index, and the Patents Citation Index. In Table 1, I catalog how each of these Thomson Reuters units contributed data that were utilized in each indicator of their innovative universities ranking. The THE could not produce an innovation ranking comparable to Thomson Reuters’ product, because it did not own the data." So Thomson Reuters was able to use the data to build the best innovation index and created its ranking. I think that is why Leiden may be more for bibliometrics, while Thomson Reuters also uses hard data (also Web of Science) related to innovation and commercialization that has been analyzed since 2010. As the superior data set on innovation measures compared to THE, and not reputation-based or subjective, Leiden and Reuters are more effective together, each targeting a different audience. Just for your consideration: we all have critiques of different rankings, but they are meant for different audiences, and together Leiden and Reuters would provide a more complete and effective picture for those different audiences. Music23123 (talk) 20:49, 17 October 2020 (UTC)Reply
This doesn't really address the two major problems with using the Reuters Innovation ranking as a third mission measure.
Firstly, it has very poor coverage of UK universities. Only 21 are included, even in the European ranking, less than a third as many as the next smallest (ARWU, which has 65). As you say, other rankings such as the Economist also only have the top 100 universities - but we don't include them either so that's not really relevant.
Secondly, it only measures part of one of the seven dimensions of the third mission identified in the Knowledge Exchange Framework. This means it presents a very unbalanced view, it would be like using rankings of business schools to represent the entire university. As the old saying goes, 'bad data is worse than no data'.
So it doesn't really work either as a ranking for UK universities or as a measure of the third mission. It's also not clear what a European ranking tells us in terms of the third mission. If we want to include the third mission, it would probably be best to wait for the KEF results to come out (currently scheduled for December) and include those in the 'British Government assessment' section. This won't have full coverage (like the TEF) as higher education is a devolved issue, but would be far more comprehensive and inclusive than the Reuters Innovation ranking. Robminchin (talk) 23:35, 18 October 2020 (UTC)Reply

Leiden Optional 2022 - Publish or Perish - Ranking Assessment Reform in 2022


Hi you all,

I just read an article from the EU about ranking assessment reform for 2022 [[73]] and the San Francisco Declaration on Research Assessment reform in 2022 [[74]], signed by 22,000 people [75]. It seems times have changed since 2017, when a publish-or-perish mentality was the predominant way to evaluate a university. The publish-or-perish mentality, forming decisions primarily on the journal impact factor, is now becoming outdated as a form of evaluation. There have been many controversies with faculty over the overuse of this factor in hiring and firing decisions as a way of evaluation.

There is now talk of ranking assessment reform to evaluate a university more holistically, on a broader variety of factors, rather than relying on a single factor like journal impact. It is also recommended to include qualitative assessments. Is it fair to institutions or faculty to evaluate them on any single metric of research performance in a publish-or-perish mentality? It is now recommended against doing so.

It means emphasising a broader range of factors for evaluating the research quality of an institution, and it is recommended to greatly reduce any simplistic focus on journal impact factor in favour of a holistic approach to evaluating a university. Evaluating a university by a single research output factor, such as the proportion of top 10% publications, rather than a broad array of metrics, is worrisome. It is at best an incomplete evaluation of a university, and does not reflect the whole of the Leiden rankings, which take a wider look at research through three different lenses.

While the number of publications in top journals was used predominantly before, it is now recommended against this publish-or-perish mentality, which puts undue pressure on faculty and is not a holistic assessment of a university's research. It fails to capture a lot of nuance in any institution. Ideally, more holistic and richer approaches are needed to evaluate universities, and it is now recommended to reduce the emphasis on journal impact factor when evaluating a university's performance, for a wider look at the institution. In light of this new way of thinking about ranking assessment by the EU for 2022, and the ranking assessment reform recommendations, I think we should consider making the Leiden optional or removing it. Best, Mikecurry1 (talk) 10:13, 23 February 2022 (UTC)Reply

Both of these initiatives are concerned with research assessment by funding bodies, not with ranking of universities by independent organisations. While similar, these are not the same thing. All rankings are incomplete evaluations of universities – but it shouldn't be the job of Wikipedia editors to decide which rankings are 'good' and which ones are 'bad'. Instead, we have included those most cited in literature looking at rankings (see earlier discussions). If some authoritative body were to use the principles outlined for research assessment to produce something like a recommended list of rankings to use, that would be something we can use, but if we try to decide which rankings to include based on our own application of the principles that would get dangerously close to being WP:OR. Robminchin (talk) 03:24, 26 February 2022 (UTC)Reply
That's a fair point, and I agree. It would make sense if there were some authoritative body that gave a recommended list of rankings to use. That would be ideal, if something like that is being updated with these newer research reforms. I guess we can follow up on that in time. I thought the San Francisco Declaration on Research Assessment, signed by 22,000 people, was quite influential in my thinking about research assessment reform: away from an emphasis on a single journal impact factor as a measure of institutional quality, and towards a broader array of metrics, including some qualitative factors, to inform the quality of an institution. A broader array of metrics seems a more authentic measure of institutional quality than an emphasis on a single journal impact factor, which was originally devised to decide which journals a librarian should buy. The journal impact factor was never meant for ranking colleges, or for hiring decisions, which college rankings are often used for. I did like your thought on an authoritative body to help us decide. What should we consider an authoritative body for this? Are we looking at government lists (I am not sure how many of these there are that are being updated)? For example, for student consideration, the British Council lists only THE and QS as world rankings to choose a university by [76]. I could be open to that as an independent organization, to keep it simple, and then update the ranking recommendations as the British Council does. Are we looking at larger-scale influential bodies that help guide students towards the right colleges (such as, for example, the popular independent organization study.eu [77], which lists rankings for all the UK universities too)? What do you think are the criteria we should use for an authoritative body that we would be open to and think is fair for reconsidering what to include on this list? Best, Mikecurry1 (talk) 21:15, 1 March 2022 (UTC)Reply
What do you think about using the British Council's recommended list of global rankings? It would make sense to go with the British Council, an independent, authoritative organization in the UK, which gives proper guidance on how UK students should consider university rankings. The British Council is the UK organization whose goal is to guide students internationally towards further educational opportunities. For global rankings it mentions using only THE and QS, which are the UK's global rankings. The reasons the British Council suggested these global rankings were to take into account student satisfaction, teaching, facilities, extracurriculars, and career outcomes [78]. This is not an exhaustive ranking list, and we may not need one, as individual wiki editors arguing over which global rankings to include (such as US News, Reuters, Leiden) has caused edit wars. As Robminchin suggested, by using an independent UK authoritative organization's suggested list of rankings, we would get around many of these challenges of wiki editors deciding which rankings are good or bad. There are not unlimited independent UK organizations that are authoritative bodies like the British Council. If we can find another UK authoritative list we can consider it, but the British Council may be the best, as it is designed to guide prospective students for their futures. Perhaps we should use the British Council's suggested ranking list for global rankings. Mikecurry1 (talk) 15:44, 20 March 2022 (UTC)Reply

Add REF


Should the REF be added to this infobox under government assessment? See https://www.timeshighereducation.com/news/ref-2021-research-excellence-framework-results-announced. Itsallacademic (talk) 17:29, 14 May 2022 (UTC)Reply

I took the initiative and did this. Itsallacademic (talk) 07:16, 15 May 2022 (UTC)Reply

What would be included as the REF? There are at least three different rankings from THE and one from RPN (which may be identical to one of the THE rankings). None of these rankings is, in itself, a government assessment and shouldn't be listed here as such. They are derived from a government assessment, but the actual assessment gives a REF profile at the subject level for each submission, not institutional level rankings. Until this can be resolved I've removed the REF from the infobox. Robminchin (talk) 19:03, 15 May 2022 (UTC)
Some fair points there. I was using THE's GPA measure, which seems to have become the "headline" figure used in university press releases, etc. Perhaps we could call it "REF (THE GPA)" in the infobox? Itsallacademic (talk) 06:32, 16 May 2022 (UTC)
I think the headline figure in university press releases generally varies depending on which measure that university does best on! Certainly The Guardian and some other papers have used the Research Professional News research power measure (which looks to be identical to the THE market share measure from the Top 30 list, but the RPN methodology is paywalled so there may be subtle differences). Robminchin (talk) 07:42, 16 May 2022 (UTC)
That was the case in 2014 for sure, but this time GPA seems to be more accepted as the standard measure, at least from what I've seen. Maybe I've not looked at enough universities' press releases though. Itsallacademic (talk) 16:48, 16 May 2022 (UTC)
I'd say it's still very much the case. On a very quick look, UCL, King's and Cardiff all talk about 'research power' and don't mention GPA ranking, while Imperial is all about the GPA ranking. Nottingham mentions both, but leads with research power. Durham talks about the percentage of research classed as 3* or 4* (as do some of the others) and individual departments that made the top ten on GPA, but is silent on the institution-wide GPA. There simply isn't agreement on a single measure, so everyone is picking and choosing the one that makes them look best. Robminchin (talk) 01:01, 24 May 2022 (UTC)
Yeah, Itsallacademic raises some good points about the inclusion of a REF score.
In terms of labels, I think a simple REF score, labelled "REF" or "Research Excellence Framework", in the same way there is a "TEF". What is actually included in the ranking can simply be a footnote in the documentation, or on the talk page, explaining what to enter. It becomes overly technical for the 95% of readers who do not follow the rankings; there are already so many acronyms that most readers barely know what they are. So figuring out what to include behind the scenes, in the documentation, could be best.
Robminchin also makes a good point that every university is reporting its results differently. The results, such as the highest percentage of 4* research, were probably never meant to be ranked in the first place, and yet everything is ranked nowadays. A lot of it is advertising, so there are questions about its use in that regard too (similar to the TEF point about advertising). Perhaps we should discuss the methodology for what to include in a REF score. Mikecurry1 (talk) 14:12, 8 June 2022 (UTC)
Here is the methodology:
"The data published today by the four UK funding bodies present the proportion of each institution’s Research Excellence Framework submission, in each unit of assessment, that falls into each of five quality categories.
For output and overall profiles, these are 4* (world-leading), 3* (internationally excellent), 2* (internationally recognised), 1* (nationally recognised) and unclassified (below nationally recognised or fails to meet the definition of research).
For impact, they are 4* (outstanding in terms of reach and significance), 3* (very considerable), 2* (considerable), 1* (recognised but modest) and unclassified (little or no reach or significance).
For environment, they are 4* (“conducive to producing research of world-leading quality and enabling outstanding impact, in terms of its vitality and sustainability”), 3* (internationally excellent research/very considerable impact), 2* (internationally recognised research/considerable impact), 1* (nationally recognised research/recognised but modest impact) and unclassified (“not conducive to producing research of nationally recognised quality or enabling impact of reach and significance”).
For the overall institutional table, Times Higher Education aggregates these profiles into a single institutional quality profile based on the number of full-time equivalent staff submitted to each unit of assessment. This reflects the view that larger departments should count for more in calculating an institution’s overall quality.
Institutions are, by default, ranked according to the grade point average (GPA) of their overall quality profiles. GPA is calculated by multiplying its percentage of 4* research by 4, its percentage of 3* research by 3, its percentage of 2* research by 2 and its percentage of 1* research by 1; those figures are added together and then divided by 100 to give a score between 0 and 4.
We also present research power scores. These are calculated by multiplying the institution’s GPA by the total number of full-time equivalent staff submitted, and then scaling that figure such that the highest score in the ranking is 1,000. This is an attempt to produce an easily comparable score that takes into account volume as well as GPA, reflecting the view that excellence is, to some extent, a function of scale as well as quality. Research power also gives a closer indication of the relative size of the research block grant that each institution is likely to receive on the basis of the REF results.
However, block grants are actually calculated according to funding formulas that currently take no account of any research rated 2* or below. The formula is slightly different in Scotland, but in England, Wales and Northern Ireland, the “quality-related” (QR) funding formula also accords 4* research four times the weighting of 3* research. Hence, we also offer a market share metric. This is calculated by using these quality weightings, along with submitted FTEs, to produce a “quality-related volume” score; each institution’s market share is the proportion of all UK quality-related volume accounted for by that institution."
So it seems GPA is calculated by weighting the overall quality profile: the percentage of 4* research times 4, plus the percentage of 3* research times 3, plus the percentage of 2* research times 2, plus the percentage of 1* research times 1, with the total divided by 100 to give a score between 0 and 4.
GPA has the advantage of reflecting the overall quality of the research, so it does make sense in some regard, but its weakness is that it ignores the size of the school.
Research power has the advantage of taking the size of the school into account, but the actual funding formulas it approximates take no account of any 2* or 1* research, and they differ between Scotland and the rest of the UK, which makes comparisons across the UK problematic. Because of these problems with research power, Times Higher Education also offers a market share metric.
Market share seems complicated, and is meant to correct some of the flaws in the research power weightings across Scotland and the rest of the UK. It is calculated by using the funding-formula quality weightings, along with submitted FTEs, to produce a "quality-related volume" score; each institution's market share is the proportion of all UK quality-related volume accounted for by that institution. So it supposedly takes into account the weighting issues between Scotland and the rest of the UK in the research power metric. It is a little too complicated, though, and somewhat artificial.
So while I have problems with GPAs in education, I can also see the logic behind using what Times Higher Education reports as GPA, the default ordering of their table.
It does make sense, as Itsallacademic mentioned, in that it is the default way of approaching the REF scores.
I am open either to not including this, due to the advertising concerns Robminchin mentioned, or to discussing further what to include, which can be done behind the scenes. I can go along with whatever you decide about including the REF or not.
It is also possible to have one REF score under the government ranking (such as the THE default), and then a new separate REF subject-score textbox, as is done in the US wikis, where they have a subject-ranking textbox too (subject rankings for medicine, sociology, etc.). Mikecurry1 (talk) 19:52, 8 June 2022 (UTC)
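For concreteness, the three THE measures quoted above reduce to simple arithmetic. This is only an illustrative sketch of the formulas as described in the methodology, not THE's actual code; the quality profiles and staff numbers below are invented:

```python
# Sketch of the three THE measures derived from REF quality profiles.
# p4..p1 are the percentages of research rated 4*, 3*, 2*, 1*.

def gpa(p4, p3, p2, p1):
    """Grade point average: weighted quality profile, scaled to 0-4."""
    return (4 * p4 + 3 * p3 + 2 * p2 + 1 * p1) / 100

def research_power(gpa_score, fte):
    """GPA multiplied by submitted full-time-equivalent staff
    (before THE's final rescaling so the top score is 1,000)."""
    return gpa_score * fte

def quality_related_volume(p4, p3, fte):
    """Funding-formula weighting: 4* counts four times 3*;
    research rated 2* or below is ignored."""
    return (4 * p4 + 1 * p3) / 100 * fte

# Two hypothetical institutions (made-up numbers):
a = gpa(40, 45, 13, 2)   # 3.23
b = gpa(25, 50, 20, 5)   # 2.95
print(research_power(a, 1200), research_power(b, 2000))

# Market share: each institution's proportion of total UK
# quality-related volume (here, just these two institutions).
va = quality_related_volume(40, 45, 1200)
vb = quality_related_volume(25, 50, 2000)
print(va / (va + vb))
```

This makes the trade-offs above concrete: GPA is size-blind, research power scales GPA by staff volume, and market share applies the funding-formula weights instead of the GPA weights.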
REF is not a ranking, and taking one extrapolated measurement and placing it in a rankings box is subjective POV. What would make sense and would be helpful is a separate standardised REF table which includes a full set of data (number of staff, number of units, GPA, research power, etc.) and could be included in the research section of university articles. Simonebeltrasse (talk) 09:39, 9 June 2022 (UTC)

Remove TEF


While this was certainly valid when it was added, the Office for Students now says "These TEF awards were made under the initial TEF scheme and may not provide an up-to-date reflection of teaching quality. We have advised universities and colleges to stop advertising their TEF awards for this reason. We are developing a revised TEF scheme and currently aim to publish new TEF awards in early 2023." With this in mind, should we remove this from the rankings infobox until such time as up-to-date information is available? Robminchin (talk) 19:08, 15 May 2022 (UTC)

It does seem like advertising is an issue for both REF and TEF; removing them or not is fine by me. It could be nice to have REF and TEF if we can figure them out, but we can also take them out until a conclusion is reached, if that is preferred given the undue-advertising concerns. I am open either way. Mikecurry1 (talk)
Neither REF nor TEF is a ranking; I'm not sure why either would be included in a rankings box. Rankings can be extrapolated based on REF outcomes, but there are many ways to rank using the data and it is subjective which to use (research power, GPA, etc.). Simonebeltrasse (talk) 09:33, 9 June 2022 (UTC)
I've commented it out for now so we can easily restore it if and when up-to-date TEF awards are published. Robminchin (talk) 04:32, 22 June 2022 (UTC)
Makes sense; I can agree and go along with that. The textbox is looking better; good improvement, Robminchin (talk · contribs). Mikecurry1 (talk) 20:02, 8 July 2022 (UTC)

2022 Rankings Used by UK Gov (Authoritative Body) - Rankings for World Universities List - Use These


In the last discussion on updating the world university rankings, Robminchin proposed using an authoritative body, rather than individual editors' preferences, to decide which rankings to use: "If some authoritative body were to use the principles outlined for research assessment to produce something like a recommended list of rankings to use, that would be something we can use." Published a few days ago, the new High Potential Individual UK visa uses world rankings for UK visas: "To qualify for a work visa, a person must have attended a university which appeared in the top 50 of at least two of the Times Higher Education World University Rankings (THE), the Quacquarelli Symonds World University Rankings (QS) or The Academic Ranking of World Universities (ARWU), in the year in which they graduated." "The list of eligible universities for a UK work visa is based on rankings from around the world." Therefore, as recommended, using a UK authoritative body to decide which rankings to use for the UK, I propose also using only the THE, QS, and ARWU for the world rankings. I imagine it is a very reasonable solution to use the UK government's choice of rankings as an authoritative body for the UK, and it would get rid of the cherry-picking of individual rankings in the future (as Robminchin suggested). Any thoughts? [79] [80] Mikecurry1 (talk) 17:54, 1 June 2022 (UTC)
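As an aside, the visa eligibility rule quoted above reduces to a simple predicate: top 50 in at least two of the three named rankings in the graduation year. A minimal sketch (the ranking positions below are hypothetical, not real data):

```python
# Sketch of the High Potential Individual visa rule quoted above:
# eligible if the university was in the top 50 of at least two of
# THE, QS, and ARWU in the year of graduation.

def hpi_eligible(ranks):
    """ranks: mapping of ranking name -> position that year (None if unranked)."""
    top_50 = sum(1 for r in ranks.values() if r is not None and r <= 50)
    return top_50 >= 2

print(hpi_eligible({"THE": 30, "QS": 55, "ARWU": 45}))  # True
print(hpi_eligible({"THE": 70, "QS": 55, "ARWU": 45}))  # False
```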

This seems reasonable to me. Robminchin (talk) 03:43, 2 June 2022 (UTC)
Great, I thought so too. Per Robminchin's suggestion, it would then get rid of the problem of individual editors cherry-picking and choosing their own rankings to use. So this list of world rankings that the UK government is using seemed quite reasonable to me too. If there are any other thoughts, please list them below. Mikecurry1 (talk) 03:50, 2 June 2022 (UTC)
I will implement this now, then, matching the world rankings to those used by the UK government for the High Potential Individual visa, so that the world rankings are set by an authoritative UK government body rather than by individual editors cherry-picking rankings. [81] Mikecurry1 (talk) 19:49, 3 June 2022 (UTC)

Removal of this infobox from multiple pages


This infobox has recently been removed from multiple pages in what appears to be a POV fork. Rather than doing this, I invite Dr.AndrewBamford to come here, make the case for the inclusion of additional rankings, and gain community consensus for the changes. Robminchin (talk) 07:12, 23 June 2022 (UTC)

I am again asking @Dr.AndrewBamford: to join discussions on which rankings to include on this page rather than trying to replace this template, and also to refrain from suggestions that the community consensus here on which rankings to include has been driven by personal gain. Robminchin (talk) 17:18, 23 June 2022 (UTC)
I must say that I would prefer the change made, as none of those using the UK infobox are referenced. If you wish to use the UK template in articles then each entry should be supplied with a reference or the entry should be removed. Keith D (talk) 17:57, 23 June 2022 (UTC)
I'm confused by the comment above, as the template seems to be fully referenced. What am I missing? Jonathan A Jones (talk) 09:22, 25 June 2022 (UTC)
It's always appeared as fully referenced to me. The references are contained in the template rather than in the code that appears in the page source, which might be causing the confusion, but it always appears fully referenced when displayed. Robminchin (talk) 21:08, 25 June 2022 (UTC)
The reference just comes from the template, not from the article. As such, there should be an accessdate for each entry in each article, so that you know when the change was made and can verify that the template reference has not been changed after you accessed it. Keith D (talk) 21:18, 25 June 2022 (UTC)

Inclusion of CWTS ranking, and national QS and ARWU rankings for comparability.


CWTS at the least should not be missing from the infobox, hence the need for an alternative infobox. Dr.AndrewBamford (talk) 20:10, 23 June 2022 (UTC)

The inclusion of CWTS Leiden was discussed extensively in the last few months. I've generally been in favour of its inclusion, but after it was not included among the major rankings used by the UK Government (see discussion above) this became untenable. We try, as much as possible, to follow external authorities rather than relying on the judgment of editors, in keeping with Wikipedia's principles. For CWTS Leiden to be reinstated requires evidence that it is considered alongside the currently-included three global rankings by authoritative external bodies. Robminchin (talk) 21:16, 25 June 2022 (UTC)

Inclusion of Daily Mail in National Rankings for a Consensus


The Daily Mail, with over 200 million subscribers, now has a UK university rankings table. Should the Daily Mail ranking be added to the national rankings?

Here is a summary of the inclusion criteria for the Daily Mail league table compared to other league tables: [82]

I imagine that, as a major UK publication, it should be included; I was just aiming to find a consensus.

I agree with the previous consensus that individual editors should not cherry-pick rankings for this template themselves. - Mikecurry1 (talk)

No, the Daily Mail is deprecated as a source on Wikipedia – see WP:DAILYMAIL. As there have been two RfCs that concluded that it wasn't a reliable source and shouldn't be used, we can't override that decision with a discussion here. Robminchin (talk) 23:50, 7 March 2024 (UTC)
That all makes sense. Given the two RfCs at WP:DAILYMAIL and its reliability, it shouldn't be in this table. Mikecurry1 (talk)