MEASURING THE IMPACT OF SCIENCE: BEYOND TRADITIONS. COMPARATIVE ANALYSIS OF MODERN SCIENTOMETRIC TOOLS AND THEIR ROLE IN DETERMINING SCIENTIFIC CONTRIBUTION
Keywords:
research impact; research assessment; research engagement; scientometrics; citation analysis; Dimensions; Lens; Scilit; OpenAlex; Semantic Scholar; Statista; Opendatabot
Abstract
Assessing the quality and value of scholarly research is crucial for individual researchers, academic institutions, and larger entities such as networks, nations, regions, and industries. Scientific research is typically assessed through a combination of quantitative (bibliometric and scientometric) and qualitative (expert) indicators, the former of which relies chiefly on citation analysis. Over the last few decades, innovations that supplement traditional scientometric and bibliometric methodologies have emerged in response to the challenges posed by open science, especially in the areas of open data, open access, and open peer review. At the same time, the technological landscape has changed markedly: open citation practices have been adopted, standards such as DOI and ORCID have spread, and artificial intelligence technologies such as scientific knowledge graphs have advanced. Modern cloud infrastructures and computational capacity make data more accessible and analysis more efficient, provided the data (and metadata) are properly prepared. Alongside traditional scientometric databases such as Web of Science and Scopus, the field has come to rely heavily on a number of powerful tools and initiatives, including Dimensions, Lens, Scilit, OpenAlex, Crossref, Google Scholar, Semantic Scholar, OpenCitations, and ScientoPy. The purpose of this article is to present an overview and comparison of several of these platforms and tools, with a focus on their responsible use in impact research and science evaluation. The study concludes that scientometric indicators should be used only as a supplement to expert evaluation, that they should be interpreted with caution, and that multiple services and tools should be employed to ensure the multidimensionality and reliability of scientometric analysis, whether applied to assessing researchers and their work or to forecasting research strategies. We are confident that this experience will inform the continued development of the National Electronic Scientific Information System (URIS) and the Open Ukrainian Science Citation Index (OUCI).
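To make the closing recommendation concrete, the following minimal sketch (an illustration added to this overview, not part of the study itself) retrieves the citation count of a single DOI from three of the open services discussed: OpenAlex, Crossref, and Semantic Scholar. The endpoint URLs and JSON field names follow the public documentation of these APIs at the time of writing and may change; the example DOI is arbitrary.

import requests

DOI = "10.1162/qss_a_00112"  # example: Visser et al. (2021), cited below

def openalex_citations(doi: str) -> int:
    # OpenAlex resolves a work by its DOI and reports citations in `cited_by_count`.
    r = requests.get(f"https://api.openalex.org/works/https://doi.org/{doi}")
    r.raise_for_status()
    return r.json()["cited_by_count"]

def crossref_citations(doi: str) -> int:
    # Crossref exposes its own tally as `is-referenced-by-count`.
    r = requests.get(f"https://api.crossref.org/works/{doi}")
    r.raise_for_status()
    return r.json()["message"]["is-referenced-by-count"]

def semantic_scholar_citations(doi: str) -> int:
    # Semantic Scholar Graph API; request only the `citationCount` field.
    r = requests.get(
        f"https://api.semanticscholar.org/graph/v1/paper/DOI:{doi}",
        params={"fields": "citationCount"},
    )
    r.raise_for_status()
    return r.json()["citationCount"]

if __name__ == "__main__":
    for name, fn in [
        ("OpenAlex", openalex_citations),
        ("Crossref", crossref_citations),
        ("Semantic Scholar", semantic_scholar_citations),
    ]:
        try:
            print(f"{name}: {fn(DOI)} citations")
        except requests.RequestException as exc:
            print(f"{name}: request failed ({exc})")

The three counts will typically differ, because each service indexes a different subset of the scholarly record (cf. Visser et al., 2021); this divergence is precisely why the conclusions above call for combining multiple sources rather than relying on any single indicator.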
References
Horovyi, V. M. (2015). Kryterii yakosti naukovykh doslidzhen u konteksti zabezpechennia natsionalnykh interesiv [Criteria for the quality of scientific research in the context of ensuring national interests]. Visnyk Natsionalnoi akademii nauk Ukrainy, 6, 74–80 (in Ukr.).
Mryglod, O., & Nazarovets, S. (2019). Naukometriia ta upravlinnia naukovoiu diialnistiu: vkotre pro svitove ta ukrainske [Scientometrics and management of scientific activities: once again about the global and Ukrainian]. Visnyk Natsionalnoi akademii nauk Ukrainy, 9, 81–94. https://doi.org/10.15407/visn2019.09.081 (in Ukr.).
Pavliuk, K. V., & Kaminska, O. S. (2019). Zarubizhnyi dosvid otsinky yakosti naukovoi diialnosti [Foreign experience of assessing the quality of scientific activity]. Naukovi pratsi NDFI, 3, 25–40. https://doi.org/10.33763/npndfi2019.03.025 (in Ukr.).
Pylypenko, H. M., & Fedorova, N. Ye. (2020). Nauka yak faktor sotsialno-ekonomichnoho rozvytku suspilstva: monohrafiia [Science as a factor of socio-economic development of society: monograph]. Natsionalnyi tekhnichnyi universytet «Dniprovska politekhnika» (in Ukr.).
Yaroshenko, T. O., & Zharinova, A. H. (2023). Naukove tsytuvannia: istorychnyi i teoretychnyi landshaft [Scientific citation: historical and theoretical landscape]. Nauka ta naukoznavstvo, 3(121), 41–67. https://doi.org/10.15407/sofs2023.03.041 (in Ukr.).
Yaroshenko, T., Serbin, O., & Yaroshenko, O. (2022). Vidkryta nauka: rol universytetiv ta bibliotek u suchasnykh zminakh naukovoi komunikatsii [Open science: the role of universities and libraries in modern changes in scientific communication]. Tsyfrova platforma: informatsiini tekhnolohii v sotsiokulturnii sferi, 5(2), 277–292. https://doi.org/10.31866/2617-796X.5.2.2022.270132 (in Ukr.).
Yaroshenko, T., & Yaroshenko, O. (2020). Vysokotsytovani dokumenty naukovtsiv Ukrainy v bazakh danykh tsytuvan: koreliatsiia bibliometrychnykh indykatoriv [Highly cited documents of Ukrainian scientists in citation databases: correlation of bibliometric indicators]. Ukrainskyi zhurnal z bibliotekoznavstva ta informatsiinykh nauk, 5, 108–126. https://doi.org/10.31866/2616-7654.5.2020.205734 (in Ukr.).
Arabadzhieva, M., Vutsova, A., & Yalamov, T. (2023). In search of excellent research assessment. Baden-Baden: Nomos. https://doi.org/10.5771/9783748937203
Basson, I., Simard, M.-A., Ouangré, Z. A., Sugimoto, C. R., & Larivière, V. (2022). The effect of data sources on the measurement of open access: A comparison of Dimensions and the Web of Science. PLOS ONE, 17(3), e0265545. https://doi.org/10.1371/journal.pone.0265545
Bornmann, L. (2018). Field classification of publications in Dimensions: A first case study testing its reliability and validity. Scientometrics, 117, 637–640. https://doi.org/10.1007/s11192-018-2855-y
Bu, Y., Waltman, L., & Huang, Y. (2021). A multidimensional framework for characterizing the citation impact of scientific publications. Quantitative Science Studies, 2(1), 155–183. https://doi.org/10.1162/qss_a_00109
Curry, S., Gadd, E., & Wilsdon, J. (2022). Harnessing the Metric Tide: indicators, infrastructures & priorities for UK responsible research assessment. Report of The Metric Tide Revisited panel, December 2022. https://doi.org/10.6084/m9.figshare.21701624
Dardas, L. A., Sallam, M., Woodward, A., Sweis, N., Sweis, N., & Sawair, F. A. (2023). Evaluating Research Impact Based on Semantic Scholar Highly Influential Citations, Total Citations, and Altmetric Attention Scores: The Quest for Refined Measures Remains Illusive. Publications, 11(1), 5. https://doi.org/10.3390/publications11010005
Jiao, C., Li, K., & Fang, Z. (2023). How are exclusively data journals indexed in major scholarly databases? An examination of the Web of Science, Scopus, Dimensions, and OpenAlex. arXiv. https://doi.org/10.48550/arxiv.2307.09704
Heck, T. (2021). 8.2 Open Science and the Future of Metrics. In R. Ball (Ed.), Handbook Bibliometrics (pp. 507-516). Berlin, Boston: De Gruyter Saur. https://doi.org/10.1515/9783110646610-046
Herzog, C., Hook, D., & Konkiel, S. (2020). Dimensions: Bringing down barriers between scientometricians and data. Quantitative Science Studies, 1(1), 387-395. https://doi.org/10.1162/qss_a_00020
Hook, D. W., Porter, S. J., & Herzog, C. (2018). Dimensions: Building Context for Search and Evaluation. Frontiers in Research Metrics and Analytics, 3. https://doi.org/10.3389/frma.2018.00023
Kinney, R. M., et al. (2023). The Semantic Scholar Open Data Platform. arXiv. https://arxiv.org/abs/2301.10140
Kramer, B. (2022). COKI Open metadata report (Update March 25, 2022). https://github.com/Curtin-Open-Knowledge-Initiative/open-metadata-report
Orduña-Malea, E., & Delgado-López-Cózar, E. (2018). Dimensions: redescubriendo el ecosistema de la información científica [Dimensions: rediscovering the scientific information ecosystem]. El Profesional De La Información, 27(2), 420. https://doi.org/10.3145/epi.2018.mar.21 (in Spa.).
Porter, S. J., & Hook, D. W. (2022). Connecting Scientometrics: Dimensions as a Route to Broadening Context for Analyses. Frontiers in Research Metrics and Analytics, 7. https://doi.org/10.3389/frma.2022.835139
Priem, J., Piwowar, H., & Orr, R. (2022). OpenAlex: A fully-open index of scholarly works, authors, venues, institutions, and concepts. arXiv. https://arxiv.org/abs/2205.01833
Ruiz-Rosero, J., Ramirez-Gonzalez, G., & Viveros-Delgado, J. (2019). Software survey: ScientoPy, a scientometric tool for topics trend analysis in scientific publications. Scientometrics, 121, 1165–1188. https://doi.org/10.1007/s11192-019-03213-w
Scheidsteger, T., & Haunschild, R. (2023). Which of the metadata with relevance for bibliometrics are the same and which are different when switching from Microsoft Academic Graph to OpenAlex? El Profesional De La Información. https://doi.org/10.3145/epi.2023.mar.09
McShea, J. (2018). Dimensions – a game-changing product launch from Digital Science. Outsell. https://figshare.com/s/68dcc69f3fe6189098bb
Singh, V. K., Singh, P., Karmakar, M., Leta, J., & Mayr, P. (2021). The journal coverage of Web of Science, Scopus and Dimensions: A comparative analysis. Scientometrics, 126(6), 5113–5142. https://doi.org/10.1007/s11192-021-03948-5
Singh Chawla, D. (2022, January 24). Massive open index of scholarly papers launches. Nature. https://doi.org/10.1038/d41586-022-00138-y
Singh, P., Singh, V. K., & Piryani, R. (2023). Scholarly article retrieval from Web of Science, Scopus and Dimensions: A comparative analysis of retrieval quality. Journal of Information Science, 0(0). https://doi.org/10.1177/01655515231191351
Thelwall, M. (2018). Dimensions: A competitor to Scopus and the Web of Science? Journal of Informetrics, 12(2), 430–435. https://doi.org/10.1016/j.joi.2018.03.006
Visser, M., Van Eck, N. J., & Waltman, L. (2021). Large-scale comparison of bibliographic data sources: Scopus, Web of Science, Dimensions, Crossref, and Microsoft Academic. Quantitative Science Studies, 2(1), 20–41. https://doi.org/10.1162/qss_a_00112
Wang, K., Shen, Z., Huang, C., Wu, C.-H., Dong, Y., & Kanakia, A. (2020). Microsoft Academic Graph: When experts are not enough. Quantitative Science Studies, 1(1), 396-413. https://doi.org/10.1162/qss_a_00021