Scientometrics

From Wikipedia, the free encyclopedia

Scientometrics is the field of study which concerns itself with measuring and analysing scholarly literature. Scientometrics is a sub-field of bibliometrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.[1] In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low quality research.

Historical development

Scientometrics was introduced by Vasily Nalimov in 1969 under its Russian name naukometriya, which translates to “scientometrics” in English.[2][3][4][5] Modern scientometrics is mostly based on the work of Derek J. de Solla Price and Eugene Garfield. The latter created the Science Citation Index[1] and founded the Institute for Scientific Information, which is heavily used for scientometric analysis. A dedicated academic journal, Scientometrics, was established in 1978. The industrialization of science increased the quantity of publications and research outcomes, and the rise of computers allowed effective analysis of this data.[6] While the sociology of science focused on the behavior of scientists, scientometrics focused on the analysis of publications.[1] Accordingly, scientometrics is also referred to as the scientific and empirical study of science and its outcomes.[7][8]

The International Society for Scientometrics and Informetrics, founded in 1993, is an association of professionals in the field.[9]

Later, around the turn of the century, the evaluation and ranking of scientists and institutions came more into the spotlight. Based on bibliometric analysis of scientific publications and citations, the Academic Ranking of World Universities ("Shanghai ranking") was first published in 2004 by Shanghai Jiao Tong University. Impact factors became an important tool for choosing between journals, and rankings such as the Academic Ranking of World Universities and the Times Higher Education World University Rankings (THE ranking) became leading indicators of the status of universities. The h-index became an important indicator of the productivity and impact of a scientist's work, although alternative author-level indicators have been proposed.[10][11]
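
The h-index is defined as the largest number h such that a researcher has published h papers that have each been cited at least h times. The following is a minimal sketch of that calculation, using hypothetical citation counts.

```python
# Minimal sketch: computing an author's h-index from per-paper citation
# counts; the counts below are hypothetical.
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 4, 3, 1]))  # 4: four papers each cited at least 4 times
```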

Around the same time, governments' interest in evaluating research in order to assess the impact of science funding increased. As investments in scientific research were included in the U.S. American Recovery and Reinvestment Act of 2009 (ARRA), a major economic stimulus package, programs like STAR METRICS were set up to assess whether the expected positive impact on the economy actually occurred.[12]

Methods and findings

Methods of research include qualitative, quantitative and computational approaches. The main focus of studies has been on institutional productivity comparisons, institutional research rankings, journal rankings,[7][8][13] establishing faculty productivity and tenure standards,[14] assessing the influence of top scholarly articles,[15] and developing profiles of top authors and institutions in terms of research performance.[16]

One significant finding in the field is a principle of cost escalation, to the effect that achieving further findings at a given level of importance grows exponentially more costly in the expenditure of effort and resources. However, new algorithmic methods in search, machine learning and data mining are showing that this is not the case for many information retrieval and extraction-based problems.[citation needed]

More recent methods rely on open source and open data to ensure transparency and reproducibility, in line with modern open science requirements. For instance, the Unpaywall index and attendant research on open access trends are based on data retrieved from OAI-PMH endpoints of thousands of open archives provided by libraries and institutions worldwide.[17]
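
As an illustration of this kind of harvesting (not Unpaywall's actual pipeline), the sketch below issues an OAI-PMH ListRecords request to a repository endpoint and extracts record titles. The endpoint URL is hypothetical, and only the first page of results is handled (no resumption tokens).

```python
# Minimal sketch: harvesting Dublin Core record titles from an OAI-PMH
# endpoint. The endpoint URL used in the example is hypothetical.
import requests
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def harvest_titles(base_url, metadata_prefix="oai_dc"):
    """Yield record titles from a single ListRecords response (no resumption)."""
    response = requests.get(
        base_url,
        params={"verb": "ListRecords", "metadataPrefix": metadata_prefix},
        timeout=30,
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)
    for record in root.iter(OAI_NS + "record"):
        title = record.find(".//" + DC_NS + "title")
        if title is not None:
            yield title.text

# Example usage (hypothetical endpoint):
# for t in harvest_titles("https://archive.example.org/oai"):
#     print(t)
```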

Common scientometric indexes

Indexes may be classified as article-level metrics, author-level metrics, and journal-level metrics depending on which feature they evaluate.

Impact factor

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI).
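
Concretely, the standard two-year impact factor for year Y divides the citations received in Y by articles the journal published in years Y−1 and Y−2 by the number of citable items it published in those two years. A minimal sketch with made-up numbers:

```python
# Minimal sketch of the two-year journal impact factor; the numbers are
# made up for illustration.
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Citations received in year Y to items published in years Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# 500 citations in 2024 to the 200 citable items published in 2022-2023
# gives a 2024 impact factor of 2.5.
print(impact_factor(500, 200))  # 2.5
```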

Science Citation Index

The Science Citation Index (SCI) is a citation index originally produced by the Institute for Scientific Information (ISI) and created by Eugene Garfield. It was officially launched in 1964. It is now owned by Clarivate Analytics (previously the Intellectual Property and Science business of Thomson Reuters).[18][19][20][21] The larger version (Science Citation Index Expanded) covers more than 8,500 notable and significant journals, across 150 disciplines, from 1900 to the present. These are alternatively described as the world's leading journals of science and technology, because of a rigorous selection process.[22][23][24]

Acknowledgement index

An acknowledgement index (British English spelling[25]) or acknowledgment index (American English spelling[25]) is a method for indexing and analyzing acknowledgements in the scientific literature and thus quantifying the impact of acknowledgements. Typically, a scholarly article has a section in which the authors acknowledge entities such as funding sources, technical staff, and colleagues that have contributed materials or knowledge or have influenced or inspired their work. Like a citation index, it measures influences on scientific work, but in a different sense: it measures institutional and economic influences as well as informal influences of individual people, ideas, and artifacts. Unlike the impact factor, it does not produce a single overall metric, but analyses the components separately. However, the total number of acknowledgements of an acknowledged entity can be measured, as can the number of citations to the papers in which the acknowledgement appears. The ratio of this total number of citations to the total number of papers in which the acknowledged entity appears can be construed as the impact of that acknowledged entity.[26][27]
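
A minimal sketch of that ratio, with hypothetical data: each entry below is a paper acknowledging the same entity (say, a funder), mapped to its citation count.

```python
# Minimal sketch: impact of an acknowledged entity as total citations to the
# papers acknowledging it, divided by the number of such papers.
# The paper names and citation counts are hypothetical.
citations_of_acknowledging_papers = {
    "paper_a": 12,
    "paper_b": 3,
    "paper_c": 30,
}

total_citations = sum(citations_of_acknowledging_papers.values())
paper_count = len(citations_of_acknowledging_papers)
impact_of_entity = total_citations / paper_count
print(impact_of_entity)  # 15.0
```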

Altmetrics

In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics[28] proposed as an alternative[29] or complement[30] to more traditional citation impact metrics, such as the impact factor and h-index.[31] The term altmetrics was proposed in 2010,[32] as a generalization of article-level metrics,[33] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. Altmetrics use public APIs across platforms to gather data with open scripts and algorithms. Altmetrics did not originally cover citation counts,[34] but calculate scholarly impact based on diverse online research output, such as social media, online news media, online reference managers and so on.[35][36] They demonstrate both the impact and the detailed composition of that impact.[32] Altmetrics could be applied to research filtering,[32] promotion and tenure dossiers, grant applications[37][38] and the ranking of newly published articles in academic search engines.[39]
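
As a rough illustration of this kind of aggregation (not any particular provider's method), the sketch below combines hypothetical per-platform attention counts for a single article into a weighted composite score while preserving the per-platform breakdown; the platform names and weights are assumptions.

```python
# Illustrative sketch only: aggregating hypothetical per-platform attention
# counts for one article and computing a simple weighted composite score.
events = {
    "tweets": 42,
    "news_stories": 3,
    "reference_manager_saves": 110,
    "blog_posts": 5,
}

# Hypothetical weights; real providers choose and document their own.
weights = {
    "tweets": 0.25,
    "news_stories": 3.0,
    "reference_manager_saves": 0.5,
    "blog_posts": 1.0,
}

composite = sum(weights[name] * count for name, count in events.items())
print(events)     # detailed composition of the attention
print(composite)  # single composite score: 79.5
```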

Criticisms

Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low quality research.[40]

References and footnotes

  1. ^ Leydesdorff, L. and Milojevic, S., "Scientometrics", arXiv:1208.4566 (2013), forthcoming in: Lynch, M. (editor), International Encyclopedia of Social and Behavioral Sciences, subsection 85030 (2015).
  2. ^ Nalimov, Vasily Vasilyevich; Mulchenko, B. M. (1969). Scientometrics: Studies of science as a process of information (in Russian). Moscow: Nauka.
  3. ^ Garfield, Eugene (2009). "From the science of science to Scientometrics visualizing the history of science with HistCite software". Journal of Informetrics. 3 (3): 173–179. doi:10.1016/j.joi.2009.03.009. ISSN 1751-1577. Retrieved 15 May 2021.
  4. ^ Валеев, Д. Х.; Голубцов, В. Г. (2018). "Юридическая Наукометрия И Цивилистические Исследования" [Legal scientometrics and civil law research] (in Russian): 45–57. Retrieved 15 May 2021.
  5. ^ Борисов, М. В.; Майсуразде, А. И. (2014). "Восстановление связей в научном рубрикаторе на основе кластеризации гетерогенной сети" [Recovering links in a scientific subject classifier via heterogeneous network clustering] (PDF) (in Russian). Retrieved 15 May 2021.
  6. ^ De Solla Price, D., editorial statement. Scientometrics Volume 1, Issue 1 (1978)
  7. ^ Lowry, Paul Benjamin; Romans, Denton; Curtis, Aaron (2004). "Global journal prestige and supporting disciplines: A scientometric study of information systems journals". Journal of the Association for Information Systems. 5 (2): 29–80. doi:10.17705/1jais.00045. SSRN 666145.
  8. ^ Lowry, Paul Benjamin; Moody, Gregory D.; Gaskin, James; Galletta, Dennis F.; Humpherys, Sean; Barlow, Jordan B.; Wilson, David W. (2013). "Evaluating journal quality and the Association for Information Systems (AIS) Senior Scholars' journal basket via bibliometric measures: Do expert journal assessments add value?". MIS Quarterly. 37 (4): 993–1012. Also, see the YouTube video narrative of this paper at: https://www.youtube.com/watch?v=LZQIDkA-ke0.
  9. ^ "About". International Society for Scientometrics and Informetrics. Retrieved 2021-01-18.
  10. ^ Belikov, A.V.; Belikov, V.V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research. 4: 884. doi:10.12688/f1000research.7070.1. PMC 4654436.
  11. ^ Kinouchi, O. (2018). "A simple centrality index for scientific social recognition". Physica A: Statistical Mechanics and Its Applications. 491: 632–640. arXiv:1609.05273. Bibcode:2018PhyA..491..632K. doi:10.1016/j.physa.2017.08.072. S2CID 22795899.
  12. ^ Lane, J (2009). "Assessing the Impact of Science Funding". Science. 324.
  13. ^ Lowry, Paul Benjamin; Humphreys, Sean; Malwitz, Jason; Nix, Joshua C (2007). "A scientometric study of the perceived quality of business and technical communication journals". IEEE Transactions on Professional Communication. 50 (4): 352–378. doi:10.1109/TPC.2007.908733. S2CID 40366182. SSRN 1021608. Recipient of the Rudolph Joenk Award for Best Paper Published in IEEE Transactions on Professional Communication in 2007.
  14. ^ Dean, Douglas L; Lowry, Paul Benjamin; Humpherys, Sean (2011). "Profiling the research productivity of tenured information systems faculty at U.S. institutions". MIS Quarterly. 35 (1): 1–15. doi:10.2307/23043486. JSTOR 23043486. SSRN 1562263.
  15. ^ Karuga, Gilbert G.; Lowry, Paul Benjamin; Richardson, Vernon J. (2007). "Assessing the impact of premier information systems research over time". Communications of the Association for Information Systems. 19 (7): 115–131. doi:10.17705/1CAIS.01907. SSRN 976891.
  16. ^ Lowry, Paul Benjamin; Karuga, Gilbert G.; Richardson, Vernon J. (2007). "Assessing leading institutions, faculty, and articles in premier information systems research journals". Communications of the Association for Information Systems. 20 (16): 142–203. doi:10.17705/1CAIS.02016. SSRN 1021603.
  17. ^ Piwowar, Heather; Priem, Jason; Orr, Richard (2019-10-09). "The Future of OA: A large-scale analysis projecting Open Access publication and readership". doi:10.1101/795310.
  18. ^ Garfield, E. (1955). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science. 122 (3159): 108–11. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. PMID 14385826.
  19. ^ Garfield, Eugene (2011). "The evolution of the Science Citation Index" (PDF). International Microbiology. 10 (1): 65–9. doi:10.2436/20.1501.01.10. PMID 17407063.
  20. ^ Garfield, Eugene (1963). "Science Citation Index" (PDF). Science Citation Index 1961. 1: v–xvi. Retrieved 2013-05-27.
  21. ^ "History of Citation Indexing". Clarivate Analytics. November 2010. Retrieved 2010-11-04.
  22. ^ "Science Citation Index Expanded". Retrieved 2017-01-17.
  23. ^ Ma, Jiupeng; Fu, Hui-Zhen; Ho, Yuh-Shan (December 2012). "The Top-cited Wetland Articles in Science Citation Index Expanded: characteristics and hotspots". Environmental Earth Sciences. 70 (3): 1039. Bibcode:2009EES....56.1247D. doi:10.1007/s12665-012-2193-y. S2CID 18502338.
  24. ^ Ho, Yuh-Shan (2012). "The top-cited research works in the Science Citation Index Expanded" (PDF). Scientometrics. 94 (3): 1297. doi:10.1007/s11192-012-0837-z. S2CID 1301373.
  25. ^ [1]
  26. ^ Councill, Isaac G.; Giles, C. Lee; Han, Hui; Manavoglu, Eren (2005). "Automatic acknowledgement indexing: expanding the semantics of contribution in the CiteSeer digital library". Proceedings of the 3rd international conference on Knowledge capture. K-CAP '05. pp. 19–26. CiteSeerX 10.1.1.59.1661. doi:10.1145/1088622.1088627. ISBN 1-59593-163-5.
  27. ^ Giles, C. L.; Councill, I. G. (December 15, 2004). "Who gets acknowledged: Measuring scientific contributions through automatic acknowledgment indexing" (PDF). Proc. Natl. Acad. Sci. U.S.A. 101 (51): 17599–17604. Bibcode:2004PNAS..10117599G. doi:10.1073/pnas.0407743101. PMC 539757. PMID 15601767.
  28. ^ "PLOS Collections". Public Library of Science (PLOS). Altmetrics is the study and use of non-traditional scholarly impact measures that are based on activity in web-based environments
  29. ^ "The "alt" does indeed stand for "alternative"" Jason Priem, leading author in the Altmetrics Manifesto -- see comment 592
  30. ^ Haustein, Stefanie; Peters, Isabella; Sugimoto, Cassidy R.; Thelwall, Mike; Larivière, Vincent (2014-04-01). "Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature". Journal of the Association for Information Science and Technology. 65 (4): 656–669. arXiv:1308.1838. doi:10.1002/asi.23101. ISSN 2330-1643. S2CID 11113356.
  31. ^ Chavda, Janica; Patel, Anika (30 December 2015). "Measuring research impact: bibliometrics, social media, altmetrics, and the BJGP". British Journal of General Practice. 66 (642): e59–e61. doi:10.3399/bjgp16X683353. PMC 4684037. PMID 26719483.
  32. ^ Priem, Jason; Taraborelli, Dario; Groth, Paul; Neylon, Cameron (September 28, 2011). "Altmetrics: A manifesto (v 1.01)". Altmetrics.
  33. ^ Binfield, Peter (9 November 2009). "Article-Level Metrics at PLoS - what are they, and why should you care?" (Video). University of California, Berkeley.
  34. ^ Bartling, Sönke; Friesike, Sascha (2014). Opening Science: The Evolving Guide on How the Internet Is Changing Research, Collaboration and Scholarly Publishing. Cham: Springer International Publishing. p. 181. doi:10.1007/978-3-319-00026-8. ISBN 978-3-31-900026-8. OCLC 906269135. Altmetrics and article-level metrics are sometimes used interchangeably, but there are important differences: article-level metrics also include citations and usage data; ...
  35. ^ Mcfedries, Paul (August 2012). "Measuring the impact of altmetrics [Technically Speaking]". IEEE Spectrum. 49 (8): 28. doi:10.1109/MSPEC.2012.6247557. ISSN 0018-9235.
  36. ^ Galligan, Finbar; Dyas-Correia, Sharon (March 2013). "Altmetrics: Rethinking the Way We Measure". Serials Review. 39 (1): 56–61. doi:10.1016/j.serrev.2013.01.003.
  37. ^ Moher, David; Naudet, Florian; Cristea, Ioana A.; Miedema, Frank; Ioannidis, John P. A.; Goodman, Steven N. (2018-03-29). "Assessing scientists for hiring, promotion, and tenure". PLOS Biology. 16 (3): e2004089. doi:10.1371/journal.pbio.2004089. ISSN 1545-7885. PMC 5892914. PMID 29596415.
  38. ^ Rajiv, Nariani (2017-03-24). "Supplementing Traditional Ways of Measuring Scholarly Impact: The Altmetrics Way". hdl:10315/33652.
  39. ^ Mehrazar, Maryam; Kling, Christoph Carl; Lemke, Steffen; Mazarakis, Athanasios; Peters, Isabella (2018-04-08). "Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media". Proceedings of the 10th ACM Conference on Web Science. p. 215. arXiv:1804.02751. doi:10.1145/3201064.3201101. ISBN 9781450355636.
  40. ^ Weingart, Peter (2005-01-01). "Impact of bibliometrics upon the science system: Inadvertent consequences?". Scientometrics. 62 (1): 117–131. doi:10.1007/s11192-005-0007-7. ISSN 0138-9130. S2CID 12359334.
