Geographic variation in social media metrics: an analysis of Latin American journal articles

Author: Juan Pablo Alperin

Purpose: This study aims to contribute to the understanding of how the potential of altmetrics varies around the world by measuring the percentage of articles with non-zero metrics (coverage) for articles published from a developing region (Latin America).

Design/methodology/approach: This study uses article metadata from a prominent Latin American journal portal, SciELO, and combines it with altmetrics data from Altmetric.com and with data collected by author-written scripts. The study is primarily descriptive, focusing on coverage levels disaggregated by year, country, subject area, and language.

Findings: Coverage levels for most of the social media sources studied were zero or negligible. Only three metrics had coverage levels above 2%—Mendeley, Twitter, and Facebook. Of these, Twitter showed the most significant differences from previous studies. Mendeley coverage levels reach those found in previous studies, but it takes up to two years longer for articles to be saved in the reference manager; for the most recent year, coverage was less than half of what was found in previous studies. The coverage levels of Facebook appear similar (around 3%) to those of previous studies.

Research limitations/implications: The data used for some of the analyses were collected over a six-month period. For other analyses, data were only available for a single country (Brazil).

Originality/value: The results of this study have implications for the altmetrics research community and for any stakeholders interested in using altmetrics for evaluation. They suggest the need for careful sample selection when making generalizable claims about altmetrics.

Citation: Alperin, J.P. (2015). “Geographic variation in social media metrics: an analysis of Latin American journal articles”. Aslib Journal of Information Management, Vol. 67 No. 3, pp. 289-304. doi: 10.1108/AJIM-12-2014-0176


Adapting sentiment analysis for tweets linking to scientific papers

Authors: Natalie Friedrich, Timothy D. Bowman, Wolfgang G. Stock, Stefanie Haustein

Abstract: In the context of altmetrics, tweets have been discussed as potential indicators of immediate and broader societal impact of scientific documents. However, it is not yet clear to what extent Twitter captures actual research impact. A small case study (Thelwall et al., 2013b) suggests that tweets to journal articles neither comment on nor express any sentiments towards the publication, which suggests that tweets merely disseminate bibliographic information, often even automatically. This study analyses the sentiments of tweets for a large representative set of scientific papers by specifically adapting different methods to academic articles distributed on Twitter. Results will help to improve the understanding of Twitter’s role in scholarly communication and the meaning of tweets as impact metrics.

Citation: Natalie Friedrich, Timothy D. Bowman, Wolfgang G. Stock, Stefanie Haustein. (2015). Adapting sentiment analysis for tweets linking to scientific papers. arXiv


Research data explored: an extended analysis of citations and altmetrics

Authors: Isabella Peters, Peter Kraker, Elisabeth Lex, Christian Gumpenberger, Juan Gorraiz

Abstract: In this study, we explore the citedness of research data, its distribution over time and its relation to the availability of a digital object identifier (DOI) in the Thomson Reuters database Data Citation Index (DCI). We investigate if cited research data “impacts” the (social) web, reflected by altmetrics scores, and if there is any relationship between the number of citations and the sum of altmetrics scores from various social media platforms. Three tools are used to collect altmetrics scores, namely PlumX, ImpactStory, and Altmetric.com, and the corresponding results are compared. We found that out of the three altmetrics tools, PlumX has the best coverage. Our experiments revealed that research data remain mostly uncited (about 85%), although there has been an increase in citing data sets published since 2008. The percentage of cited research data with a DOI in DCI has decreased in recent years. Only nine repositories are responsible for research data with DOIs and two or more citations. The number of cited research data with altmetrics “footprints” is even lower (4–9%) but shows a higher coverage of research data from the last decade. In our study, we also found no correlation between the number of citations and the total number of altmetrics scores. Yet, certain data types (i.e. survey, aggregate data, and sequence data) are more often cited and also receive higher altmetrics scores. Additionally, we performed citation and altmetric analyses of all research data published between 2011 and 2013 in four different disciplines covered by the DCI. In general, these results correspond very well with the ones obtained for research data cited at least twice and also show low numbers in citations and in altmetrics. Finally, we observed that there are disciplinary differences in the availability and extent of altmetrics scores.

Citation: Peters, I., Kraker, P., Lex, E. et al. (2016). Research data explored: an extended analysis of citations and altmetrics. Scientometrics, 107: 723. doi:10.1007/s11192-016-1887-4


Tweets as impact indicators: Examining the implications of automated bot accounts on Twitter

Authors: Stefanie Haustein, Timothy D. Bowman, Kim Holmberg, Andrew Tsou, Cassidy R. Sugimoto, Vincent Larivière

Abstract: This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific papers deposited on the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. We present preliminary findings that automated Twitter accounts create a considerable number of tweets to scientific papers and that they behave differently than common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose differentiating between levels of engagement, from tweeting only bibliographic information to discussing or commenting on the content of a paper.

Citation: Stefanie Haustein, Timothy D. Bowman, Kim Holmberg, Andrew Tsou, Cassidy R. Sugimoto, Vincent Larivière. (2014). Tweets as impact indicators: Examining the implications of automated bot accounts on Twitter. arXiv


Scholarly Metrics Baseline: A Survey of Faculty Knowledge, Use, and Opinion about Scholarly Metrics

Authors: Dan DeSanto and Aaron Nichols

Abstract: This article presents the results of a faculty survey conducted at the University of Vermont during academic year 2014–2015. The survey asked faculty about: familiarity with scholarly metrics, metric-seeking habits, help-seeking habits, and the role of metrics in their department’s tenure and promotion process. The survey also gathered faculty opinions on how well scholarly metrics reflect the importance of scholarly work and how faculty feel about administrators gathering institutional scholarly metric information. Results point to the necessity of understanding the campus landscape of faculty knowledge, opinion, importance, and use of scholarly metrics before engaging faculty in further discussions about quantifying the impact of their scholarly work.

Citation: DeSanto, D. & Nichols, A. (2017). Scholarly Metrics Baseline: A Survey of Faculty Knowledge, Use, and Opinion about Scholarly Metrics. College & Research Libraries, vol. 78 no. 2, pp. 150-170. doi:10.5860/crl.78.2.150


How many scientific papers are mentioned in policy-related documents?

Authors: Robin Haunschild, Lutz Bornmann

Abstract: In this short communication, we provide an overview of a relatively new source of altmetrics data which could possibly be used for societal impact measurements in scientometrics. Recently, Altmetric – a start-up providing publication-level metrics – started to make data available for publications which have been mentioned in policy-related documents. Using data from Altmetric, we study how many papers indexed in the Web of Science (WoS) are mentioned in policy-related documents. We find that less than 0.5% of the papers published in different subject categories are mentioned at least once in policy-related documents. Based on our results, we recommend that the analysis of (WoS) publications with at least one policy-related mention is repeated regularly (annually). Mentions in policy-related documents should not be used for impact measurement until new policy-related sites are tracked.

Citation: Haunschild, R. & Bornmann, L. (2016). How many scientific papers are mentioned in policy-related documents? An empirical investigation using Web of Science and Altmetric data. Preprint.


Scholarly use of social media and altmetrics: a review of the literature

Authors: Sugimoto C, Work S, Larivière V, Haustein S

Abstract: Social media has become integrated into the fabric of the scholarly communication system in fundamental ways: principally through scholarly use of social media platforms and the promotion of new indicators on the basis of interactions with these platforms. Research and scholarship in this area have accelerated since the coining and subsequent advocacy for altmetrics — that is, research indicators based on social media activity. This review provides an extensive account of the state of the art in both scholarly use of social media and altmetrics. The review consists of two main parts: the first examines the use of social media in academia, examining the various functions these platforms have in the scholarly communication process and the factors that affect this use. The second part reviews empirical studies of altmetrics, discussing the various interpretations of altmetrics, data collection and methodological limitations, and differences according to platform. The review ends with a critical discussion of the implications of this transformation in the scholarly communication system.

Citation: Sugimoto C, Work S, Larivière V, Haustein S. (2016). Scholarly use of social media and altmetrics: a review of the literature. JASIST


Grand challenges in altmetrics: heterogeneity, data quality and dependencies

Author: Haustein, Stefanie

Abstract: As uptake among researchers is constantly increasing, social media are finding their way into scholarly communication and, under the umbrella term altmetrics, were introduced to research evaluation. Fueled by technological possibilities and an increasing demand to demonstrate impact beyond the scientific community, altmetrics received great attention as potential democratizers of the scientific reward system and indicators of societal impact. This paper focuses on current challenges of altmetrics. Heterogeneity, data quality and particular dependencies are identified as the three major issues and discussed in detail with a particular emphasis on past developments in bibliometrics. The heterogeneity of altmetrics mirrors the diversity of the types of underlying acts, most of which take place on social media platforms. This heterogeneity has made it difficult to establish a common definition or conceptual framework. Data quality issues become apparent in the lack of accuracy, consistency and replicability of various altmetrics, which is largely affected by the dynamic nature of social media events. It is further highlighted that altmetrics are shaped by technical possibilities and depend particularly on the availability of APIs and DOIs, are strongly dependent on data providers and aggregators, and potentially influenced by technical affordances of underlying platforms.

Citation: Haustein, S. (2016). Grand challenges in altmetrics: heterogeneity, data quality and dependencies. Scientometrics. doi: 10.1007/s11192-016-1910-9


A Systematic Identification and Analysis of Scientists on Twitter

Authors: Qing Ke, Yong-Yeol Ahn, Cassidy R. Sugimoto

Abstract: Metrics derived from Twitter and other social media—often referred to as altmetrics—are increasingly used to estimate the broader social impacts of scholarship. Such efforts, however, may produce highly misleading results, as the entities that participate in conversations about science on these platforms are largely unknown. For instance, if altmetric activities are generated mainly by scientists, does it really capture broader social impacts of science? Here we present a systematic approach to identifying and analyzing scientists on Twitter. Our method can be easily adapted to identify other stakeholder groups in science. We investigate the demographics, sharing behaviors, and interconnectivity of the identified scientists. Our work contributes to the literature both methodologically and conceptually—we provide new methods for disambiguating and identifying particular actors on social media and describing the behaviors of scientists, thus providing foundational information for the construction and use of indicators on the basis of social media metrics.

Citation: Qing Ke, Yong-Yeol Ahn, Cassidy R. Sugimoto. (2016). A Systematic Identification and Analysis of Scientists on Twitter. arXiv


An Automatic Method for Assessing the Teaching Impact of Books from Online Academic Syllabi

Authors: Kousha, Kayvan and Thelwall, Mike

Abstract: Scholars writing books that are widely used to support teaching in higher education may be undervalued due to a lack of evidence of teaching value. Whilst sales data may give credible evidence for textbooks, they may poorly reflect educational uses of other types of books. As an alternative, this article proposes a method to automatically search for mentions of books in online academic course syllabi, based on Bing searches for syllabi mentioning a given book and filtering out false matches through an extensive set of rules. The method had an accuracy of over 90% based on manual checks of a sample of 2,600 results from the initial Bing searches. Over a third of about 14,000 monographs checked had one or more academic syllabus mentions, with more in the arts and humanities (56%) and social sciences (52%). Low but significant correlations between syllabus mentions and citations across most fields, except the social sciences, suggest that books tend to have different levels of impact for teaching and research. In conclusion, the automatic syllabus search method gives a new way to estimate the educational utility of books in a way that sales data and citation counts cannot.

Citation: Kousha, K. & Thelwall, M. (2016). An Automatic Method for Assessing the Teaching Impact of Books from Online Academic Syllabi. JASIST, 67(12)