Scientometric Analysis Essay

Scientometrics is the study of measuring and analysing science, technology and innovation. Major research issues include the measurement of impact, reference sets of articles to investigate the impact of journals and institutes, understanding of scientific citations, mapping scientific fields and the production of indicators for use in policy and management contexts.[1] In practice there is a significant overlap between scientometrics and other scientific fields such as bibliometrics, information systems, information science and science of science policy.

Historical development

Modern scientometrics is mostly based on the work of Derek J. de Solla Price and Eugene Garfield. The latter created the Science Citation Index[1] and founded the Institute for Scientific Information, whose data are heavily used for scientometric analysis. A dedicated academic journal, Scientometrics, was established in 1978. The industrialization of science increased the quantity of publications and research outcomes, and the rise of computers allowed effective analysis of this data.[2] While the sociology of science focused on the behavior of scientists, scientometrics focused on the analysis of publications.[1] Accordingly, scientometrics is also referred to as the scientific and empirical study of science and its outcomes.[3][4]

Later, around the turn of the century, the evaluation and ranking of scientists and institutions came into the spotlight. Based on bibliometric analysis of scientific publications and citations, the Academic Ranking of World Universities ("Shanghai ranking") was first published in 2004 by Shanghai Jiao Tong University. Impact factors became an important tool for choosing between journals, and rankings such as the Academic Ranking of World Universities and the Times Higher Education World University Rankings (THE ranking) became leading indicators of the status of universities. The h-index became an important indicator of the productivity and impact of a scientist's work. However, alternative author-level indicators have been proposed (see, for example, [5]).
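
As an illustration of the metric's definition, the following is a minimal Python sketch of the h-index: the largest h such that a scientist has h papers with at least h citations each.

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # there are still at least `rank` papers with >= rank citations
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 2 and 1 times give an h-index of 3.
print(h_index([10, 8, 5, 2, 1]))  # -> 3
```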

Around the same time, interest among governments in evaluating research for the purpose of assessing the impact of science funding increased. As investments in scientific research were included in the U.S. American Recovery and Reinvestment Act of 2009 (ARRA), a major economic stimulus package, programs like STAR METRICS were set up to assess whether the positive impact on the economy would actually occur.[6]

Methods

Methods of research include qualitative, quantitative, and computational approaches. The main foci of studies have been on institutional productivity comparisons, institutional research rankings, journal rankings,[3][4][7] establishing faculty productivity and tenure standards,[8] assessing the influence of top scholarly articles,[9] and developing profiles of top authors and institutions in terms of research performance.[10]

One significant finding in the field is a principle of cost escalation, to the effect that achieving further findings at a given level of importance grows exponentially more costly in the expenditure of effort and resources. However, new algorithmic methods in search, machine learning, and data mining are showing that this is not the case for many information retrieval and extraction-based problems. Related fields are the history of science and technology, the philosophy of science, and the sociology of scientific knowledge.

Journals in the field include Scientometrics, the Journal of the American Society for Information Science and Technology, and the Journal of Informetrics.[11] The International Society for Scientometrics and Informetrics, founded in 1993, is an association of professionals in the field.

References and footnotes

  1. Leydesdorff, L. and Milojevic, S., "Scientometrics", arXiv:1208.4566 (2013); forthcoming in: Lynch, M. (ed.), International Encyclopedia of Social and Behavioral Sciences, subsection 85030 (2015).
  2. De Solla Price, D., editorial statement. Scientometrics, Volume 1, Issue 1 (1978).
  3. Lowry, Paul Benjamin; Romans, Denton; Curtis, Aaron (2004). "Global journal prestige and supporting disciplines: A scientometric study of information systems journals". Journal of the Association for Information Systems. 5 (2): 29–80. SSRN 666145.
  4. Lowry, Paul Benjamin; Moody, Gregory D.; Gaskin, James; Galletta, Dennis F.; Humpherys, Sean; Barlow, Jordan B.; Wilson, David W. (2013). "Evaluating journal quality and the Association for Information Systems (AIS) Senior Scholars’ journal basket via bibliometric measures: Do expert journal assessments add value?". MIS Quarterly. 37 (4): 993–1012. A video narrative of this paper is available at: https://www.youtube.com/watch?v=LZQIDkA-ke0&feature=youtu.be.
  5. Belikov, A.V.; Belikov, V.V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research. 4: 884. doi:10.12688/f1000research.7070.1.
  6. Lane, J. (2009). "Assessing the Impact of Science Funding". Science. 324.
  7. Lowry, Paul Benjamin; Humphreys, Sean; Malwitz, Jason; Nix, Joshua C. (2007). "A scientometric study of the perceived quality of business and technical communication journals". IEEE Transactions on Professional Communication. 50 (4): 352–378. doi:10.1109/TPC.2007.908733. SSRN 1021608. Recipient of the Rudolph Joenk Award for Best Paper Published in IEEE Transactions on Professional Communication in 2007.
  8. Dean, Douglas L.; Lowry, Paul Benjamin; Humpherys, Sean (2011). "Profiling the research productivity of tenured information systems faculty at U.S. institutions". MIS Quarterly. 35 (1): 1–15. SSRN 1562263.
  9. Karuga, Gilbert G.; Lowry, Paul Benjamin; Richardson, Vernon J. (2007). "Assessing the impact of premier information systems research over time". Communications of the Association for Information Systems. 19 (7): 115–131. SSRN 976891.
  10. Lowry, Paul Benjamin; Karuga, Gilbert G.; Richardson, Vernon J. (2007). "Assessing leading institutions, faculty, and articles in premier information systems research journals". Communications of the Association for Information Systems. 20 (16): 142–203. SSRN 1021603.
  11. "Journal of Informetrics".

I conducted co-citation analysis at the author and journal levels, as described further below.

Author-based co-citation analysis.

I conducted co-citation analysis of authors who were co-cited across the four periods to offer a more holistic interpretation of the evolution of the field. In the analysis, I included only articles that had 20 or more citations and used the name of the first author only to avoid overly cluttered maps, following the procedures suggested by Rodrigues et al. [45] and Waltman and Van Eck [43]. Fig 4 shows the co-citations for the period of 2008–2013, while S4–S6 Figs show the co-citations for the other three periods. Larger nodes and node labels reflect higher citation counts (and vice versa), while node colors and adjacency depict the clusters of topical themes that emerge.
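
For readers unfamiliar with the mechanics, the following is a minimal sketch of author co-citation counting, not the actual VOSviewer pipeline used in this study: two first authors are co-cited whenever both appear in the same article's reference list, and articles below the citation threshold are excluded first. The record layout is a hypothetical simplification.

```python
from collections import Counter
from itertools import combinations

# Hypothetical records: each article carries its own citation count and the
# first authors of the works it references.
articles = [
    {"citations": 35, "cited_first_authors": ["Gartner", "Cooper", "Porter"]},
    {"citations": 12, "cited_first_authors": ["Gartner", "Miller"]},  # below threshold
    {"citations": 58, "cited_first_authors": ["Porter", "Mintzberg", "Gartner"]},
]

def author_cocitations(articles, min_citations=20):
    """Tally unordered pairs of first authors cited together in well-cited articles."""
    pairs = Counter()
    for article in articles:
        if article["citations"] < min_citations:
            continue  # keep only articles with 20 or more citations
        authors = sorted(set(article["cited_first_authors"]))
        for a, b in combinations(authors, 2):
            pairs[(a, b)] += 1  # one co-citation for this author pair
    return pairs

print(author_cocitations(articles).most_common(3))
```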

The results show that two major clusters of author co-citation relations emerged in the 1990–1995 period (see S4 Fig): entrepreneurship—psychology (Gartner, Cooper, Birley, Brockhaus, Aldrich, and Hannan; green circles) and strategy—general management (Burgelman, Kanter, Porter, Mintzberg, Drucker, Miller, McMillan; red circles). These co-citation clusters resembled the four clusters identified in the topic mapping analysis above, although they were less refined (see S1 Fig and column 7 of Table 1).

Next, author co-citation relations in the 1996–2001 period revealed four co-citation clusters (see S5 Fig): entrepreneurship—innovation—psychology (Gartner, Cooper, MacMillan, Timmons and the group; red circles); strategy—innovation (Covin, Zahra, Dess, Miller, Mintzberg, Burgelman, Stevenson; green circles); strategy—economics (Porter, Eisenhardt, Williamson, Tushman, Barney and the group; blue circles); and organization—technology—innovation (Carroll, Hannan, Acs, Westhead, Storey, Rothwell, Shane and the group; yellow circles). In this period, new highly cited scholars emerged: Covin, Zahra, Miller, Williamson, Eisenhardt, Venkataraman, and Shane. These co-citation clusters resembled the 11 clusters identified in the topic mapping analysis above, although they were less refined (see S2 Fig and column 6 of Table 1).

The author co-citation relations in the 2002–2007 period revealed six co-citation clusters (see S6 Fig): entrepreneurship—psychology (Shane, Gartner, Cooper, Johannisson, Busenitz, Davidsson, Baron, McClelland, Venkataraman, Sarasvathy and the group; red circles); economics—innovation (Audretsch, Baumol, Evans, Reynolds, Acs, Storey, Kirzner, Teece, Nelson and the group; blue circles); institutions—network—technology—innovation—sociology (Aldrich, Dimaggio, Greenwood, Burt, Powell, Garud, Hannan, Tushman, Van de Ven and the group; yellow circles); international—entrepreneurship (McDougall, Oviatt, Knight, Johanson, Cavusgil; purple circles); strategy—technology—organization (Zahra, Miller, Covin, Lumpkin, Porter, Eisenhardt, Barney, Burgelman, Von Hippel, Alvarez and the group; green circles); and venture capital—finance—family business—cognition (Westhead, Shepherd, Wright, Jensen, Chrisman, Sharma, Ensley and the group; light blue circles). Among the most highly cited scholars in this period were Shane, Zahra, Aldrich, Gartner, Eisenhardt, Audretsch, McDougall, Oviatt, Baron, and Dimaggio. The one scholar who emerged in this period but had not featured in the prior two periods was McDougall. These co-citation clusters resembled the 28 clusters identified in the topic mapping analysis above, although they were less refined (see S3 Fig and column 5 of Table 1).

The author co-citation relations in the 2008–2013 period, as shown in Fig 4, revealed very dense clustering patterns compared to the previous three periods (see S4–S6 Figs). Nine co-citation clusters emerged in this period: innovation—technology—venture capital—institution (Klepper, Cohen, Lerner, Nelson, Zucker, Stuart and the group; purple circles); economics—innovation—networks (Audretsch, Acs, Parker, Baumol, Fritsch, Arenius, Bosma, Minniti, North, and the group; yellow circles); entrepreneurship—psychology—cognition—sociology—women (Shane, Gartner, Aldrich, Davidsson, Baron, Bandura, Sarasvathy, Baker, Shepherd, Busenitz, Kirzner, Brush, Krueger, and the group; blue circles); institution—organization—innovation—sociology—network (Dimaggio, Greenwood, Weick, Garud, Powell, Johannisson, Suddaby, Dorado, and the group; green circles); social entrepreneurship—narrative—education (Mair, Steyaert, Hjorth, Austin, Nicholls, Tracey, Dees, Jones, Chell, Peredo, and the group; also green circles); family business—strategy (Chrisman, Sharma, Sirmon, Schulze, and the group; teal blue circles); strategy—networks—capabilities—exploration (Zahra, Eisenhardt, Miller, Covin, Lumpkin, Teece, Barney, Kogut, McGrath and the group; red circles); marketing (Slater, Kohli, Narver, Day, Atuahene-Gima, Zhou, and the group; also red circles); and international—entrepreneurship (McDougall, Oviatt, Coviello, Johanson, Dunning, Jones, Knight, Madsen, Cavusgil, and the group; light blue circles).

As shown above, co-citation analysis at the author level alone produced rather crude, overlapping topic clusters, and the results can be difficult to interpret and depend on the researcher’s subjectivity in classifying and labeling them into topics. Nevertheless, they did reveal themes that supported the topic mapping results.

Journal-based co-citation analysis.

To add more depth to the analysis, I conducted co-citation analysis based on journal sources of articles that had 20 or more citations (n = 945 articles) for the entire 1990–2013 period, using Van Eck’s Java-based VOSviewer techniques [33, 43]. Results are shown in Fig 5. The figure depicts rather diverse and complex journal co-citation clusters: entrepreneurship—psychology (Journal of Business Venturing, Entrepreneurship Theory and Practice, Journal of Applied Psychology, Journal of Personality and Social Psychology; blue circles); management—organizations (Academy of Management Journal, Academy of Management Review, Administrative Science Quarterly, Organization Science; green circles); family business (Family Business Review; brown circles); economics—finance (Small Business Economics, American Economic Review, Entrepreneurship and Regional Development, Journal of Finance; red circles); technology—innovation (Research Policy, Management Science, Technovation, R&D Management; purple circles); strategy—management (Strategic Management Journal, Journal of Management, Harvard Business Review; yellow circles); international business—entrepreneurship (Journal of International Business Studies, International Business Review, Journal of International Marketing; light blue circles); and marketing—innovation (Journal of Marketing, Journal of Product Innovation Management, Industrial Marketing Management; also yellow circles). The patterns of journal co-citations were not surprising and resembled the more refined topic mapping results shown in Table 1.
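
VOSviewer does not map raw co-citation counts directly; it first normalizes them with the association strength measure, in which each co-citation count is divided by the product of the two items' total link weights (up to a constant factor). The following is a minimal sketch of that normalization on a toy journal matrix; the matrix values are illustrative only.

```python
import numpy as np

# Toy symmetric journal co-citation matrix (illustrative values only);
# rows/columns: JBV, ETP, AMJ.
cooc = np.array([
    [0.0, 40.0, 10.0],
    [40.0, 0.0,  8.0],
    [10.0, 8.0,  0.0],
])

# Association strength: s_ij is proportional to c_ij / (w_i * w_j),
# where w_i is journal i's total co-citation weight.
totals = cooc.sum(axis=1)
strength = cooc / np.outer(totals, totals)
np.fill_diagonal(strength, 0.0)
print(strength.round(5))
```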

Overlay visualization analysis of new and hot topics.

First, to produce “new topics” using the time-based overlay visualization, I plotted the entire topic mapping’s terms (or words) and clusters from 1990–2013, and I overlaid the base map with numerical information to depict new topics (and later, hot topics) in entrepreneurship research. I chose the year 2008 as the midpoint at 1.0 of the scale (green). To visualize the new topics, the terms that appeared in the topic clusters were matched with the publication year of the article in which the terms appeared. Newer topics were visualized using colors ranging from yellow (relatively new) to red (the newest), while older topics were visualized from green (relatively old) to blue (the oldest), based on a normalized scale of 0–2. Thus, terms used more towards 2013 were shown in orange to red, while terms used more towards 1990 were shown in light to dark blue. This produces a color-based visualization of newer versus older publications. The result is shown in Fig 6, and the classification of topic clusters refers to Table 1. Increasing trends in publications related to the following “new topics” were observed: institutional entrepreneurship, institutional logic, institutional theory (topic cluster #12), social entrepreneurship, narrative, discourse (topic cluster #31), poverty (topic cluster #35), business ethics (topic cluster #32), family business (topic cluster #27), internationalization and international entrepreneurship (topic cluster #16), and global entrepreneurship monitor and use of panel data (unclassified cluster).

Fig 6. Overlay map of “new topics” in entrepreneurship research (1990–2013).

The closer two terms are to each other, the stronger their relations. A normalized scale of 0–2 was used to indicate the newness of publications. Year 2008 was used as the mid-point (score 1). Terms that are used more towards 2013 are shown in orange to red, while terms that are used more towards 1990 are shown in light to dark blue. Each term occurs in at least 10 publications.

https://doi.org/10.1371/journal.pone.0190228.g006
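
To make the recency scale concrete, here is a minimal sketch of one plausible scoring scheme consistent with the description above: a term's mean publication year is mapped piecewise onto the 0–2 scale, with 1990 at 0, 2008 at the 1.0 midpoint, and 2013 at 2. The exact rescaling VOSviewer applies may differ.

```python
def recency_score(pub_years, lo=1990, mid=2008, hi=2013):
    """Map a term's mean publication year onto the 0-2 overlay scale (2008 -> 1.0)."""
    mean_year = sum(pub_years) / len(pub_years)
    if mean_year <= mid:
        return (mean_year - lo) / (mid - lo)      # 1990..2008 -> 0..1 (blue to green)
    return 1.0 + (mean_year - mid) / (hi - mid)   # 2008..2013 -> 1..2 (yellow to red)

# A term used mostly in recent articles scores near 2 and would be drawn in red.
print(recency_score([2011, 2012, 2013]))  # -> 1.8
```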

Next, to produce “hot topics” using the citation-based overlay visualization, I plotted the terms (following [37]) with colored circles to reflect the average citation impact of each term. To visualize the hot topics (i.e., topics that appear in highly cited articles), I matched the terms that appeared in the topic clusters with the citation score of the article in which the terms appeared. I corrected for the age of publications by dividing each publication’s number of citations by the average number of citations of all publications that appeared in the same year. This yielded a publication’s normalized score; thus, a score of 1 means that a publication’s number of citations equals the average of all publications that appeared in the same field in the same year. The normalized citation scores of all publications in which the terms occurred were then averaged, after which a color scale ranging from blue (0; the coldest) through green (midpoint of 1.0; relatively cold) and yellow (relatively hot) to red (2; the hottest) was used to plot the terms. Therefore, terms with a low average citation impact were marked blue, while terms with a high average citation impact were marked red. This produces a color-based visualization of hot (highly cited) versus cold (less cited) publications. The result is shown in Fig 7, and the classification of topic clusters refers to Table 1. The “hot topics” included: institutional work, institutional logic, institutional entrepreneurship (topic cluster #12), opportunity discovery and recognition (topic cluster #5), international new venture, international business study (topic cluster #16), entrepreneurial orientation, innovativeness (topic cluster #18), cognition, emotion, identity (a mix of topic clusters #1 and #34), top management team (topic cluster #46), strategic alliance (overlap of topic clusters #4 and #11), performance and profitability (topic cluster #2), and in-depth case study and conceptualization (unclassified cluster).

Fig 7. Overlay map of overall “hot topics” in entrepreneurship research (1990–2013).

The closer two terms are to each other, the stronger their relations. The size and color of a term indicates, respectively, the number of publications in which the term occurs and the average citation impact of these publications. A normalized scale of 0–2 was used to indicate the average citation impact of publications. Blue indicates a low citation impact, green a normal citation impact, and red a high citation impact. Each term occurs in at least 10 publications.

https://doi.org/10.1371/journal.pone.0190228.g007
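
The age correction described above reduces to a few lines of code. In this minimal sketch (with a hypothetical record layout), each publication's citations are divided by the mean citation count of all publications from the same year, and a term's "hotness" is the mean of those normalized scores across the publications containing it.

```python
from collections import defaultdict

# Hypothetical records: (publication year, citation count, terms in the article).
pubs = [
    (2005, 30, {"institutional logic"}),
    (2005, 10, {"marketing"}),
    (2010, 80, {"institutional logic", "opportunity discovery"}),
    (2010, 20, {"marketing"}),
]

# Baseline for age normalization: mean citations per publication year.
cites_by_year = defaultdict(list)
for year, cites, _ in pubs:
    cites_by_year[year].append(cites)
year_mean = {y: sum(c) / len(c) for y, c in cites_by_year.items()}

# A publication scoring 1.0 is cited exactly at the average rate for its year;
# a term's score averages the normalized scores of its publications.
term_scores = defaultdict(list)
for year, cites, terms in pubs:
    normalized = cites / year_mean[year]
    for term in terms:
        term_scores[term].append(normalized)

for term, scores in sorted(term_scores.items()):
    print(term, round(sum(scores) / len(scores), 2))
```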
