Wednesday, May 4, 2022

SCImago - 2022: isomorphisms...

The Honourable Schoolboy 


https://www.scimagoir.com/compare.php?idps[]=6856 

N.B. MANDATORY reading and reference: https://www.scimagoir.com/methodology.php

Ranking Methodology

General considerations

The SCImago Institutions Rankings (SIR) is a classification of academic and research-related institutions ranked by a composite indicator that combines three different sets of indicators based on research performance, innovation outputs and societal impact measured by their web visibility.

It provides a friendly interface that allows the visualization of any customized ranking from the combination of these three sets of indicators. Additionally, it is possible to compare the trends for individual indicators of up to six institutions. For each large sector it is also possible to obtain distribution charts of the different indicators.

For comparative purposes, the value of the composite indicator has been set on a scale of 0 to 100. However, the line graphs and bar graphs always represent ranks (lower is better, so the highest values are the worst).
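
As a minimal illustration of these two conventions, here is a short Python sketch using hypothetical scores (not SIR data): it rescales composite values to 0-100 (a plain min-max rescaling, which is one plausible reading of "set on a scale of 0 to 100") and then converts them to the ranks that the graphs display.

    # Hypothetical composite values for four institutions (not SIR data).
    composite = {"Inst A": 3.1, "Inst B": 2.4, "Inst C": 1.9, "Inst D": 0.7}

    # Rescale to 0-100 with a simple min-max transformation.
    lo, hi = min(composite.values()), max(composite.values())
    scaled = {k: 100 * (v - lo) / (hi - lo) for k, v in composite.items()}

    # The line and bar graphs plot ranks instead: rank 1 is the best
    # score, so lower rank values are better.
    ordered = sorted(scaled, key=scaled.get, reverse=True)
    ranks = {name: i + 1 for i, name in enumerate(ordered)}
    print(scaled)  # {'Inst A': 100.0, 'Inst B': 70.83..., 'Inst C': 50.0, 'Inst D': 0.0}
    print(ranks)   # {'Inst A': 1, 'Inst B': 2, 'Inst C': 3, 'Inst D': 4}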

SCImago Standardization: In order to achieve the highest level of precision for the different indicators, an extensive manual process of disambiguation of institution names has been carried out. The development of an assessment tool for bibliometric analysis aimed at characterizing research institutions involves an enormous data-processing task: the identification and disambiguation of institutions through the institutional affiliations of the documents included in Scopus. The objective of SCImago in this respect is twofold:

  1. Definition and unique identification of institutions: drawing up a list of research institutions in which every institution is correctly identified and defined. Typical issues in this task include institutional mergers, splits and changes of name.
  2. Attribution of publications and citations to each institution: we take into account the institutional affiliation of each author in the 'affiliation' field of the database. We have developed a mixed (manual and automatic) system for assigning affiliations to one or more institutions, as applicable, as well as for identifying duplicate documents with the same DOI and/or title.

Thoroughness in the identification of institutional affiliations is one of the key values of the standardization process, which guarantees, in every case, the highest possible level of disambiguation.

Institutions can be grouped by the countries to which they belong. Multinational institutions (MUL), which cannot be attributed to any single country, have also been included.

The institutions marked with an asterisk consist of a group of sub-institutions, identified by the abbreviated name of the parent institution. The parent institutions show the aggregated results of all of their sub-institutions.

Institutions can also be grouped by sector (Universities, Health, Government, ...).

For ranking purposes, the calculation is generated each year from the results obtained over a five-year period ending two years before the edition of the ranking. For instance, if the selected year of publication is 2021, the results used are those from the five-year period 2015-2019. The only exception is the web indicators, which are calculated for the last year only.
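
A minimal sketch of that windowing rule (the function name is ours, for illustration only):

    def publication_window(edition_year: int) -> range:
        """Five-year results window ending two years before the edition.

        For the 2021 edition, the window is 2015-2019.
        """
        end = edition_year - 2
        return range(end - 4, end + 1)

    print(list(publication_window(2021)))  # [2015, 2016, 2017, 2018, 2019]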

The inclusion criterion is that an institution must have published at least 100 works indexed in the Scopus database during the last year of the selected time period.

The source of information for the innovation indicators is the PATSTAT database.

The sources of information for the web-visibility indicators are Google and Ahrefs.

The Unpaywall database is used to identify Open Access documents.

Altmetrics from PlumX Metrics and Mendeley are used for the Societal factor.

From now on, the SIR is a league table. The aim of the SIR is to provide a useful metric tool for institutions, policymakers and research managers for the analysis, evaluation and improvement of their activities, outputs and outcomes.

Best Quartile is the highest quartile an institution reaches within its country when comparing the quartiles based on the overall indicator, the research factor, the innovation factor and the societal factor.

Indicators

Indicators are divided into three groups intended to reflect the scientific, economic and social characteristics of institutions. The SIR includes both size-dependent and size-independent indicators, that is, indicators that are and are not influenced by the size of the institution. In this manner, the SIR provides overall statistics on the scientific publications and other output of institutions, while at the same time enabling comparisons between institutions of different sizes. Keep in mind that, once the final indicator has been calculated from the combination of the different indicators (each assigned a different weight), the resulting values are normalized on a scale of 0 to 100.

Score Indicators

FACTOR              INDICATOR                           WEIGHT
Research (50%)      Normalized Impact (NI)              13%
                    Excellence with Leadership (EwL)    8%
                    Output (O)                          8%
                    Scientific Leadership (L)           5%
                    Not Own Journals (NotOJ)            3%
                    Own Journals (OJ)                   3%
                    Excellence (Exc)                    2%
                    High Quality Publications (Q1)      2%
                    International Collaboration (IC)    2%
                    Open Access (OA)                    2%
                    Scientific Talent Pool (STP)        2%
Innovation (30%)    Innovative Knowledge (IK)           10%
                    Patents (PT)                        10%
                    Technological Impact (TI)           10%
Societal (20%)      Altmetrics (AM)                     10%
                    Inbound Links (BN)                  5%
                    Web Size (WS)                       5%
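
Reading the table as the weights of a linear combination, the sketch below shows one way the composite indicator could be assembled. This is our interpretation for illustration only; the per-indicator values are assumed to be already normalized to a common scale, and SCImago's exact aggregation code is not published here.

    # Weights from the table above, keyed by indicator abbreviation.
    WEIGHTS = {
        # Research (50%)
        "NI": 0.13, "EwL": 0.08, "O": 0.08, "L": 0.05, "NotOJ": 0.03,
        "OJ": 0.03, "Exc": 0.02, "Q1": 0.02, "IC": 0.02, "OA": 0.02, "STP": 0.02,
        # Innovation (30%)
        "IK": 0.10, "PT": 0.10, "TI": 0.10,
        # Societal (20%)
        "AM": 0.10, "BN": 0.05, "WS": 0.05,
    }
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # the factors sum to 100%

    def composite_score(indicators: dict[str, float]) -> float:
        """Weighted combination of the 17 indicator values; the results
        are then rescaled to 0-100 across institutions (see the min-max
        sketch earlier in this document)."""
        return sum(WEIGHTS[code] * indicators[code] for code in WEIGHTS)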

Research:

  1. Normalized Impact (Leadership Output) (NI): Normalized Impact is computed over the institution's leadership output using the methodology established by the Karolinska Institutet in Sweden, where it is called the "item-oriented field-normalized citation score average". Citation values are normalized at the individual article level. The values (in decimal numbers) show the relationship between an institution's average scientific impact and the world average, which is set to a score of 1; i.e., an NI score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above average (see the sketch after this list) (Rehn and Kronman, 2008; González-Pereira, Guerrero-Bote and Moya-Anegón, 2011; Guerrero-Bote and Moya-Anegón, 2012). Size-independent indicator.
  2. Excellence with Leadership (EwL): Excellence with Leadership indicates the number of documents in Excellence in which the institution is the main contributor (Moya-Anegón et al., 2013). Size-dependent indicator.
  3. Output (O): total number of documents published in scholarly journals indexed in Scopus (Romo-Fernández, et al., 2011; OECD, 2016). Size-dependent indicator.
  4. Not Own Journals Output (NotOJ): number of documents not published in the institution's own journals (i.e., journals published by the institution). Size-dependent indicator. Added in the 2019 edition.
  5. Own Journals (OJ): number of journals published by the institution (publishing services). Size-dependent indicator. Added in the 2019 edition.
  6. International Collaboration (IC): an institution's output produced in collaboration with foreign institutions. The values are computed by analyzing an institution's output whose affiliations include more than one country address (Guerrero-Bote, Olmeda-Gómez and Moya-Anegón, 2013; Lancho-Barrantes, Guerrero-Bote and Moya-Anegón, 2013; Lancho-Barrantes et al., 2013; Chinchilla-Rodríguez et al., 2010, 2012). Size-dependent indicator.
  7. High Quality Publications (Q1): the number of publications that an institution publishes in the most influential scholarly journals of the world. These are those ranked in the first quartile (25%) in their categories as ordered by SCImago Journal Rank (SJRII) indicator (Miguel, Chinchilla-Rodríguez and Moya-Anegón, 2011; Chinchilla-Rodríguez, Miguel, and Moya-Anegón, 2015). Size-dependent indicator.
  8. Excellence (Exc): Excellence indicates the amount of an institution's scientific output that is included in the top 10% of the most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions (SCImago Lab, 2011; Bornmann, Moya-Anegón and Leydesdorff, 2012; Bornmann and Moya-Anegón, 2014a; Bornmann et al., 2014b). Size-dependent indicator.
  9. Scientific Leadership (L): Leadership indicates the amount of an institution's output in which it is the main contributor, that is, the number of papers whose corresponding author belongs to the institution (Moya-Anegón, 2012; Moya-Anegón et al., 2013). Size-dependent indicator.
  10. Open Access (OA): percentage of documents published in Open Access journals or indexed in the Unpaywall database. Size-independent indicator. Added in the 2019 edition.
  11. Scientific Talent Pool (STP): total number of different authors from an institution contributing to that institution's total publication output during a particular period of time. Size-dependent indicator.
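
The sketch below illustrates the item-oriented field normalization behind NI (item 1 above), with hypothetical citation counts and field baselines; the real computation runs over Scopus data, normalizing each article against the world average for comparable papers.

    def normalized_impact(papers: list[dict]) -> float:
        """Item-oriented field-normalized citation score average.

        Each paper's citations are divided by the world-average citations
        of comparable papers; NI is the mean of those ratios, so a value
        of 1.0 equals the world average.
        """
        ratios = [p["citations"] / p["field_world_avg"] for p in papers]
        return sum(ratios) / len(ratios)

    # Hypothetical leadership output of one institution:
    papers = [
        {"citations": 12, "field_world_avg": 10.0},  # 20% above world average
        {"citations": 4,  "field_world_avg": 5.0},   # 20% below world average
    ]
    print(normalized_impact(papers))  # 1.0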

Innovation:

  1. Innovative Knowledge (IK): scientific publication output from an institution cited in patents. Based on PATSTAT (http://www.epo.org) (Moya-Anegón and Chinchilla-Rodríguez, 2015). Size-dependent.
  2. Technological Impact (TI): percentage of the scientific publication output cited in patents (see the sketch after this list). This percentage is calculated over the total output in the areas cited in patents, which are the following: Agricultural and Biological Sciences; Biochemistry, Genetics and Molecular Biology; Chemical Engineering; Chemistry; Computer Science; Earth and Planetary Sciences; Energy; Engineering; Environmental Science; Health Professions; Immunology and Microbiology; Materials Science; Mathematics; Medicine; Multidisciplinary; Neuroscience; Nursing; Pharmacology, Toxicology and Pharmaceutics; Physics and Astronomy; Social Sciences; Veterinary. Based on PATSTAT (http://www.epo.org) (Moya-Anegón and Chinchilla-Rodríguez, 2015). Size-independent.
  3. Patents (PT): number of patent applications (simple families). Based on PATSTAT (http://www.epo.org). Size-dependent.
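
A minimal sketch of the Technological Impact percentage (item 2 above), using hypothetical document records; the area list is abbreviated here and the field names are ours:

    # Subset of the areas cited in patents (full list in item 2 above).
    PATENT_CITED_AREAS = {"Chemistry", "Engineering", "Medicine"}

    def technological_impact(docs: list[dict]) -> float:
        """Percentage of the patent-relevant output cited in patents.

        Only documents in areas cited in patents enter the denominator.
        Each record is assumed to look like:
        {"area": str, "cited_in_patent": bool}.
        """
        eligible = [d for d in docs if d["area"] in PATENT_CITED_AREAS]
        if not eligible:
            return 0.0
        cited = sum(d["cited_in_patent"] for d in eligible)
        return 100 * cited / len(eligible)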

Societal impact:

  1. Altmetrics (AM): the Altmetrics indicator is calculated over the top 10% of each institution's documents (its best documents by normalized impact value). It has two components, blended as sketched after this list:
    • PlumX Metrics (weight: 70%): number of documents with more than one mention in PlumX Metrics (https://plumanalytics.com). We consider mentions on Twitter and Facebook, in blogs and news, and in comments (Reddit, SlideShare, Vimeo or YouTube).
    • Mendeley (weight: 30%): number of documents with more than one reader in Mendeley (https://www.mendeley.com).
    This indicator is size-dependent. Added in the 2019 edition.
  2. Inbound Links (BN): number of networks (subnets) from which inbound links to the institution's website originate. Data extracted from the Ahrefs database (https://ahrefs.com). Size-dependent.
  3. Web Size (WS): number of pages associated with the institution's URL according to Google (https://www.google.com) (Aguillo et al., 2010). Size-dependent.
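
A minimal sketch of the 70/30 Altmetrics blend described in item 1 above, with hypothetical field names; the thresholds follow the text (more than one mention or reader):

    def altmetrics_score(docs: list[dict]) -> float:
        """Blend PlumX (70%) and Mendeley (30%) counts over the top 10%
        of documents by normalized impact, as described above."""
        top = sorted(docs, key=lambda d: d["normalized_impact"], reverse=True)
        top = top[: max(1, len(top) // 10)]  # best 10% of the documents
        plumx = sum(d["plumx_mentions"] > 1 for d in top)
        mendeley = sum(d["mendeley_readers"] > 1 for d in top)
        return 0.70 * plumx + 0.30 * mendeley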

Bibliography

  • Aguillo, I., Bar-Ilan, J., Levene, M., & Ortega, J. (2010). Comparing university rankings. Scientometrics, 85, 243-256.
  • Arencibia-Jorge, R., Vega-Almeida, R.L., Chinchilla-Rodríguez, Z., Corera-Álvarez, E., Moya-Anegón, F. (2012) Patrones de especialización de la investigación nacional sobre Salud. Revista Cubana de Salud Pública, 38 (5). http://dx.doi.org/10.1590/S0864-34662012000500007
  • Bornmann, L., De Moya Anegón, F., Leydesdorff, L. (2012) The new Excellence Indicator in the World Report of the SCImago Institutions Rankings 2011. Journal of Informetrics, 6 (2), pp. 333-335. http://dx.doi.org/10.1016/j.joi.2011.11.006
  • Bornmann, L., & Moya Anegón, F. (2014a). What proportion of excellent papers makes an institution one of the best worldwide? Specifying thresholds for the interpretation of the results of the SCImago Institutions Ranking and the Leiden Ranking. Journal of the Association for Information Science and Technology, 65 (4), 732-736.
  • Bornmann, L., Stefaner, M., Moya-Anegón, F., & Mutz, R. (2014b). Ranking and mapping of universities and research-focused institutions worldwide based on highly-cited papers: A visualisation of results from multi-level models. Online Information Review, 38 (1), 43-58.
  • Chinchilla-Rodríguez, Z., Vargas-Quesada, B., Hassan-Montero, Y., González-Molina, A., Moya-Anegón, F. (2010). New Approach to the Visualization of International Scientific Collaboration. Information Visualization, 9 (4), 277-287. http://dx.doi.org/10.1057/ivs.2009.31
  • Chinchilla-Rodríguez, Z., Benavent-Pérez, M., Miguel, S., Moya-Anegón, F. (2012) “International Collaboration in Medical Research in Latin America and the Caribbean (2003-2007)”. Journal of the American Society for Information Science and Technology 63 (11), pp. 2223-2238. http://dx.doi.org/10.1002/asi.22669
  • Chinchilla-Rodríguez, Zaida, Zacca-González, Grisel, Vargas-Quesada, Benjamín, Moya-Anegón, Félix (2016). Benchmarking scientific performance by decomposing leadership of Cuban and Latin American institutions in Public Health. Scientometrics, 106(3), 1239–1264. http://dx.doi.org/10.1007/s11192-015-1831-z
  • González-Pereira, B., Guerrero-Bote,V., Moya-Anegón, F. (2010). A new approach to the metric of journal’s scientific prestige: The SJR indicator. Journal of Informetrics, 4(3), pp. 379–391. http://dx.doi.org/10.1016/j.joi.2010.03.002
  • Guerrero-Bote, V.P., Moya-Anegón, F. (2012) A further step forward in measuring journals' scientific prestige: The SJR2 indicator. Journal of Informetrics, 6 (4), pp. 674-688. http://dx.doi.org/10.1016/j.joi.2012.07.001
  • Guerrero-Bote, V.P., Olmeda-Gómez, C., Moya-Anegón, F. (2013) Quantifying the benefits of international scientific collaboration. Journal of the American Society for Information Science and Technology, 64 (2), pp. 392-404. http://dx.doi.org/10.1002/asi.22754
  • Lancho-Barrantes, B.S., Guerrero-Bote, V.P., Moya-Anegón, F. (2013) Citation increments between collaborating countries. Scientometrics, 94 (3), pp. 817-831.
  • Lancho-Barrantes, B. S., Guerrero-Bote, V. P., Chinchilla-Rodríguez, Z., Moya-Anegón, F. (2012) Citation Flows in the Zones of Influence of Scientific Collaborations. Journal of the American Society for Information Science and Technology 63 (3), pp. 481-489. http://dx.doi.org/10.1002/asi.21682
  • Lopez-Illescas, C., de Moya-Anegón, F., Moed, H.F. (2011) A ranking of universities should account for differences in their disciplinary specialization. Scientometrics, 88 (2), pp. 563-574. http://dx.doi.org/10.1007/s11192-011-0398-6
  • Miguel, S., Chinchilla-Rodríguez, Z., Moya-Anegón, F. (2011) Open Access and Scopus: A New Approach to Scientific Visibility From the Standpoint of Access. Journal of the American Society for Information Science and Technology, 62 (6), pp. 1130-1145. http://dx.doi.org/10.1002/asi.21532
  • Moya-Anegón, F., Chinchilla-Rodríguez, Z., Vargas-Quesada, B., Corera-Álvarez, E., González-Molina, A., Muñoz-Fernández, F.J., Herrero-Solana, V. (2007) Coverage analysis of SCOPUS: a journal metric approach. Scientometrics, 73 (1), pp. 57-58. http://dx.doi.org/10.1007/s11192-007-1681-4
  • Moed, H.F., Moya-Anegón, F., López-Illescas, C., Visser, M. (2011). Is concentration of university research associated with better research performance? Journal of Informetrics. 5 (4) 649-658. http://dx.doi.org/10.1016/j.joi.2011.06.003
  • Moya-Anegón, F. (2012) Liderazgo y excelencia de la ciencia española. Profesional de la Información, 21 (2), pp. 125-128. http://dx.doi.org/10.3145/epi.2012.mar.01
  • Moya-Anegón, F., Guerrero-Bote, V.P., Bornmann, L., Moed, H.F. (2013) The research guarantors of scientific papers and the output counting: a promising new approach. Scientometrics, 97 (2), pp. 421-434. http://dx.doi.org/10.1007/s11192-013-1046-0
  • Moya-Anegón, F. (dir.), Chinchilla-Rodríguez, Z. (coord.), Corera-Álvarez, E., González-Molina, A., Vargas-Quesada, B. (2013) Principales Indicadores Bibliométricos de la Actividad Científica Española: 2010. Madrid: Fundación Española para la Ciencia y la Tecnología.
  • Moya-Anegón, F. (dir.), Chinchilla-Rodríguez, Z. (coord.), Corera-Álvarez, E., González-Molina, A., Vargas-Quesada, B. (2013) Excelencia y liderazgo de la producción científica española 2003-2010. Madrid: Fundación Española para la Ciencia y la Tecnología.
  • Moya-Anegón, F., Chinchilla-Rodríguez, Z. Impacto tecnológico de la producción universitaria iberoamericana. En: La transferencia de la I+D, la innovación y el emprendimiento en las universidades. Educación Superior en Iberoamérica. Informe 2015. Santiago de Chile: Centro Interuniversitario de Desarrollo, 2015, p. 83-94.
  • OECD and SCImago Research Group (CSIC) (2015), Compendium of Bibliometric Science Indicators 2014, http://oe.cd/scientometrics.
  • Rehn C, Kronman U. (2008) Bibliometric handbook for Karolinska Institutet. Karolinska Institutet University Library. Version 1.05.
  • Romo-Fernández, L.M., Lopez-Pujalte, C., Guerrero Bote, V.P., Moya-Anegon, F. (2011). Analysis of Europe's scientific production on renewable energies. Renewable Energy, 36 (9), pp. 2529-2537. http://dx.doi.org/10.1016/j.renene.2011.02.001
