The SCImago Institutions Rankings (SIR) is a classification of academic and research-related institutions ranked by a composite indicator that combines three sets of indicators based on research performance, innovation outputs and societal impact, the latter measured by web visibility.
It provides a user-friendly interface that allows the visualization of any customized ranking built from the combination of these three sets of indicators. Additionally, it is possible to compare the trends of individual indicators for up to six institutions. For each large sector it is also possible to obtain distribution charts of the different indicators.
For comparative purposes, the value of the composite indicator has been set on a scale of 0 to 100. However, the line graphs and bar graphs always represent ranks (lower is better, so higher values are worse).
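The relationship between the 0-100 scale and the rank-based charts can be illustrated with a short sketch. SIR does not publish its exact scaling formula, so the min-max normalization and the rank assignment below are assumptions for illustration only:

```python
def scale_0_100(scores):
    # Min-max scale raw composite scores to [0, 100].
    # Assumption: SIR's actual scaling method is not published.
    lo, hi = min(scores), max(scores)
    return [100 * (s - lo) / (hi - lo) for s in scores]

def ranks(scores):
    # Convert scores to ranks, where rank 1 is the best (highest score),
    # matching the "lower is better" convention of the graphs.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    result = [0] * len(scores)
    for rank, i in enumerate(order, start=1):
        result[i] = rank
    return result
```

For three institutions with raw composite scores 2.0, 5.0 and 3.5, `scale_0_100` returns 0.0, 100.0 and 50.0, while `ranks` returns 3, 1 and 2.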
SCImago Standardization: In order to achieve the highest level of precision for the different indicators, an extensive manual process of disambiguation of institutions' names has been carried out. Developing an assessment tool for bibliometric analysis aimed at characterizing research institutions involves an enormous data-processing task: identifying and disambiguating institutions through the institutional affiliations of documents included in Scopus. The objective of SCImago in this respect is twofold:
- Definition and unique identification of institutions: the drawing up of a list of research institutions in which every institution is correctly identified and defined. Typical issues in this task include institutional mergers, splits and name changes.
- Attribution of publications and citations to each institution: we take into account the institutional affiliation of each author in the 'affiliation' field of the database. We have developed a mixed (manual and automatic) system for assigning affiliations to one or more institutions, as applicable, and we also identify duplicate documents sharing the same DOI and/or title.
Thoroughness in the identification of institutional affiliations is one of the key values of the standardization process, which guarantees the highest possible level of disambiguation.
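The duplicate-detection step mentioned above can be sketched as follows. The field names and matching rules are assumptions for illustration; the actual SIR pipeline is not published:

```python
def deduplicate(documents):
    # Collapse records that share a DOI, or (lacking a DOI) a normalized
    # title. Hypothetical record layout: dicts with "doi"/"title" keys.
    seen = set()
    unique = []
    for doc in documents:
        key = doc.get("doi") or doc.get("title", "").strip().lower()
        if key and key in seen:
            continue  # duplicate of an already-seen document
        if key:
            seen.add(key)
        unique.append(doc)
    return unique
```

A record with the same DOI as an earlier one is dropped even if its title differs slightly; records without a DOI fall back to case-insensitive title matching.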
Institutions can be grouped by the countries to which they belong. Multinational institutions (MUL) which cannot be attributed to any country have also been included.
The institutions marked with an asterisk consist of a group of sub-institutions, identified with the abbreviated name of the parent institution. The parent institutions show the combined results of all of their sub-institutions.
Institutions can also be grouped by sectors (Universities, Health, Government, …).
For ranking purposes, the calculation is generated each year from the results obtained over a five-year period ending two years before the edition of the ranking. For instance, if the selected year of publication is 2021, the results used are those from the five-year period 2015-2019. The only exception is the web indicators, which are calculated for the last year only.
The inclusion criterion is that the institution has published at least 100 works included in the Scopus database during the last year of the selected time period.
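The time-window rule and the inclusion threshold described above can be expressed directly. The function and parameter names are illustrative, but the arithmetic follows the methodology text:

```python
def publication_window(edition_year, span=5, lag=2):
    # Five-year window ending two years before the ranking edition,
    # e.g. edition 2021 -> (2015, 2019).
    end = edition_year - lag
    return end - span + 1, end

def eligible(output_by_year, edition_year, threshold=100):
    # An institution qualifies if it published at least `threshold`
    # works in the last year of the window (per the inclusion criterion).
    _, end = publication_window(edition_year)
    return output_by_year.get(end, 0) >= threshold
```

For the 2021 edition, `publication_window(2021)` yields (2015, 2019), and an institution with 120 works in 2019 is eligible while one with 80 is not.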
The source of information used for the innovation indicators is the PATSTAT database.
The sources of information used for the indicators for web visibility are Google and Ahrefs.
The Unpaywall database is used to identify Open Access documents.
Altmetrics from PlumX Metrics and Mendeley are used for the Societal Factor.
The SIR is, from now on, a league table. The aim of SIR is to provide a useful metric tool for institutions, policymakers and research managers for the analysis, evaluation and improvement of their activities, outputs and outcomes.
Best Quartile is the best (lowest-numbered) quartile achieved by an institution within its country, comparing the quartiles it obtains for the overall indicator, the research factor, the innovation factor and the societal factor.
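The Best Quartile rule above amounts to taking the minimum quartile number across the four values. The input format ("Q1" through "Q4" keyed by indicator) is an assumption for illustration:

```python
def best_quartile(quartiles):
    # quartiles: hypothetical mapping of indicator name -> "Q1".."Q4",
    # where Q1 is best. Best Quartile is the lowest quartile number
    # achieved across the overall indicator and the three factors.
    return f"Q{min(int(q[1]) for q in quartiles.values())}"
```

For example, an institution ranked Q2 overall, Q1 in research, Q3 in innovation and Q2 in societal impact gets a Best Quartile of Q1.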