Five principles for community altmetrics data

Author: Joe Wass

Abstract: These five principles are my answer to difficulties I have observed over the past couple of years, during which I have been collecting the kind of data that altmetrics are built from and talking and working with researchers. Altmetrics data is derived from the community, and I think that community should remain at the heart of every step.

Citation: Wass, J. (2018). Five principles for community altmetrics data. Joe’s Blog.


Source: Joe’s Blog

Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media

Authors: Maryam Mehrazar, Christoph Carl Kling, Steffen Lemke, Athanasios Mazarakis, Isabella Peters

Abstract: Measuring research impact is important for ranking publications in academic search engines and for research evaluation. Social media metrics or altmetrics measure the impact of scientific work based on social media activity. Altmetrics are complementary to traditional, citation-based metrics, e.g. allowing the assessment of new publications for which citations are not yet available. Despite the increasing importance of altmetrics, their characteristics are not well understood: Until now it has not been researched what kind of researchers are actively using which social media services and why – important questions for scientific impact prediction. Based on a survey among 3,430 scientists, we uncover previously unknown and significant differences between social media services: We identify services which attract young and experienced researchers, respectively, and detect differences in usage motivations. Our findings have direct implications for the future design of altmetrics for scientific impact prediction.

Citation: Mehrazar, M., et al. (2018). Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media. arXiv.


Source: arXiv

Presença e reputação online de pesquisadores em redes sociais acadêmicas: implicações para a comunicação científica | Researchers’ social academic network profiles and online reputation

Authors: Ronaldo Ferreira de Araújo

Abstract: This paper reports the partial results of exploratory research in progress that aims to investigate the phenomenon of researchers’ online presence and reputation in academic social networks and its implications for scholarly communication. It discusses the theoretical and methodological aspects of its construction and presents preliminary data about the online presence of 822 researchers from the Federal University of Alagoas in the main academic networks:, ResearchGate, Mendeley, and Zotero. Our results demonstrate that 63.9% of the researchers have a profile in at least one of the academic networks considered. ResearchGate (48.2%) and (39.3%) are well ahead of Mendeley (11.7%) and Zotero (0.5%). ResearchGate appears favored by researchers in the Exact and Earth Sciences, while attracts those in the Applied Social Sciences. With the growing number of online communication channels available, it is essential for researchers to manage their online presence and reputation by integrating them into their scholarly communication practices.

Citation: Araújo, R. F. de (2017). “Presença e reputação online de pesquisadores em redes sociais acadêmicas: implicações para a comunicação científica.” Pesquisa Brasileira em Ciência da Informação e Biblioteconomia, Vol. 12, Issue 2, pp. 202-211.


Source: Figshare

Altmetrics and Archives

Author: Elizabeth Joan Kelly

Abstract: Altmetrics are an alternative to traditional measurement of the impact of published resources. While altmetrics are primarily used by researchers and institutions to measure the impact of scholarly publications online, they can also be used by archives to measure the impact of their diverse online holdings, including digitized and born-digital collections, digital exhibits, repository websites, and online finding aids. Furthermore, altmetrics may fill a need for user engagement assessments for cultural heritage organizations. This article introduces the concept of altmetrics for archives and discusses barriers to adoption, best practices for collection, and potential further areas of study.

Citation: Kelly, Elizabeth Joan (2017). “Altmetrics and Archives,” Journal of Contemporary Archival Studies: Vol. 4, Article 1.


What do computer scientists tweet? Analyzing the link-sharing practice on Twitter

Authors: Marco Schmitt, Robert Jäschke

Abstract: Twitter communication has permeated every sphere of society. To highlight and share small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted and the differences between computer scientists and a Twitter sample enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and give a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications, with some publications clearly related to community-specific interests of computer scientists, while others have a strong relation to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media between an information service and a social network service. Overall, the computer scientists’ usage style leans toward the information-oriented and, to some degree, the professional side. Therefore, altmetrics are of considerable use in analyzing computer science.

Citation: Schmitt M, Jäschke R. (2017). What do computer scientists tweet? Analyzing the link-sharing practice on Twitter. PLOS ONE 12(6): e0179630.

Data Availability: All data are available from the Zenodo repository (DOI: 10.5281/zenodo.580587)


Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles

Authors: Kathleen Reed, Dana McFarland, Rosie Croft

Abstract: Objective – In order to inform a library service related to creating and maintaining online scholarly profiles, we sought to assess the knowledge base and needs of our academic communities. Participants were queried about use, issues, and attitudes toward scholarly profile and altmetric tools, as well as the role librarians could play in assisting with the curation of online reputation. Methods – Semi-structured interviews with 18 scholar-practitioners and 5 graduate students from two mid-sized universities.

Citation: Reed, K., McFarland, D., & Croft, R. (2016). Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles. Evidence Based Library and Information Practice, 11(2), 87-96.



Geographic variation in social media metrics: an analysis of Latin American journal articles

Author: Juan Pablo Alperin

Purpose: This study aims to contribute to the understanding of how the potential of altmetrics varies around the world by measuring the percentage of articles with non-zero metrics (coverage) for articles published from a developing region (Latin America).
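“Coverage” here is simply the share of articles with at least one event on a given source. A minimal sketch of computing per-year coverage from article-level metric counts (the field names and data are hypothetical, not the study’s actual schema):

```python
from collections import defaultdict

def coverage_by_year(articles, source):
    """Percentage of articles per publication year with a non-zero
    count for the given altmetrics source (e.g. 'twitter')."""
    totals = defaultdict(int)   # year -> number of articles
    nonzero = defaultdict(int)  # year -> articles with count > 0
    for art in articles:
        year = art["year"]
        totals[year] += 1
        if art["metrics"].get(source, 0) > 0:
            nonzero[year] += 1
    return {y: 100.0 * nonzero[y] / totals[y] for y in totals}

# Invented example records for illustration only
articles = [
    {"year": 2013, "metrics": {"twitter": 2, "mendeley": 0}},
    {"year": 2013, "metrics": {"twitter": 0}},
    {"year": 2014, "metrics": {"twitter": 1}},
]
print(coverage_by_year(articles, "twitter"))  # {2013: 50.0, 2014: 100.0}
```

Disaggregating by country, subject area, or language works the same way: key the two counters by the extra field as well.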

Design/methodology/approach: This study uses article metadata from a prominent Latin American journal portal, SciELO, and combines it with altmetrics data from and with data collected by author-written scripts. The study is primarily descriptive, focusing on coverage levels disaggregated by year, country, subject area, and language.

Findings: Coverage levels for most of the social media sources studied were zero or negligible. Only three metrics had coverage levels above 2%: Mendeley, Twitter, and Facebook. Of these, Twitter showed the most significant differences from previous studies. Mendeley coverage levels eventually reach those found by previous studies, but it takes up to two years longer for articles to be saved in the reference manager. For the most recent year, coverage was less than half of what was found in previous studies. The coverage levels of Facebook appear similar (around 3%) to those of previous studies.

Research limitations/implications: The data used for some of the analyses was collected over a six-month period. For other analyses, data was only available for a single country (Brazil).

Originality/value: The results of this study have implications for the altmetrics research community and for any stakeholders interested in using altmetrics for evaluation. They suggest the need for careful sample selection when making generalizable claims about altmetrics.

Citation: Juan Pablo Alperin, (2015) “Geographic variation in social media metrics: an analysis of Latin American journal articles”, Aslib Journal of Information Management, Vol. 67 Issue: 3, pp.289-304, doi: 10.1108/AJIM-12-2014-0176


Adapting sentiment analysis for tweets linking to scientific papers

Authors: Natalie Friedrich, Timothy D. Bowman, Wolfgang G. Stock, Stefanie Haustein

Abstract: In the context of altmetrics, tweets have been discussed as potential indicators of immediate and broader societal impact of scientific documents. However, it is not yet clear to what extent Twitter captures actual research impact. A small case study (Thelwall et al., 2013b) suggests that tweets to journal articles neither comment on nor express any sentiments towards the publication, which suggests that tweets merely disseminate bibliographic information, often even automatically. This study analyses the sentiments of tweets for a large representative set of scientific papers by specifically adapting different methods to academic articles distributed on Twitter. Results will help to improve the understanding of Twitter’s role in scholarly communication and the meaning of tweets as impact metrics.
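The adaptation the paper describes, classifying short, link-heavy tweets rather than ordinary text, can be illustrated with a toy lexicon-based scorer. The word lists, thresholds, and preprocessing below are hypothetical, not the authors’ actual method:

```python
# Toy lexicon-based sentiment scorer for tweets linking to papers.
# Word lists are invented for illustration.
POSITIVE = {"great", "interesting", "important", "excellent", "nice"}
NEGATIVE = {"flawed", "wrong", "misleading", "bad", "poor"}

def tweet_sentiment(text):
    # Strip common tweet artifacts: URLs, @mentions, '#' markers
    tokens = []
    for tok in text.lower().split():
        if tok.startswith(("http", "@")):
            continue
        tokens.append(tok.lstrip("#").strip(".,!?"))
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"  # bare bibliographic announcements land here

print(tweet_sentiment("Great paper on #altmetrics"))  # positive
print(tweet_sentiment("New article: ..."))  # neutral
```

Note how the second tweet, a plain bibliographic announcement, scores neutral; per the Thelwall et al. case study cited above, most tweets to journal articles look like this.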

Citation: Friedrich, N., Bowman, T. D., Stock, W. G., & Haustein, S. (2015). Adapting sentiment analysis for tweets linking to scientific papers. arXiv.


Research data explored: an extended analysis of citations and altmetrics

Authors: Isabella Peters, Peter Kraker, Elisabeth Lex, Christian Gumpenberger, Juan Gorraiz

Abstract: In this study, we explore the citedness of research data, its distribution over time and its relation to the availability of a digital object identifier (DOI) in the Thomson Reuters database Data Citation Index (DCI). We investigate if cited research data “impacts” the (social) web, reflected by altmetrics scores, and if there is any relationship between the number of citations and the sum of altmetrics scores from various social media platforms. Three tools are used to collect altmetrics scores, namely PlumX, ImpactStory, and, and the corresponding results are compared. We found that out of the three altmetrics tools, PlumX has the best coverage. Our experiments revealed that research data remain mostly uncited (about 85 %), although there has been an increase in citing data sets published since 2008. The percentage of the number of cited research data with a DOI in DCI has decreased in recent years. Only nine repositories are responsible for research data with DOIs and two or more citations. The number of cited research data with altmetrics “footprints” is even lower (4–9 %) but shows a higher coverage of research data from the last decade. In our study, we also found no correlation between the number of citations and the total number of altmetrics scores. Yet, certain data types (i.e. survey, aggregate data, and sequence data) are more often cited and also receive higher altmetrics scores. Additionally, we performed citation and altmetric analyses of all research data published between 2011 and 2013 in four different disciplines covered by the DCI. In general, these results correspond very well with the ones obtained for research data cited at least twice and also show low numbers in citations and in altmetrics. Finally, we observed that there are disciplinary differences in the availability and extent of altmetrics scores.
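The correlation test between citation counts and altmetrics scores in studies like this is typically a rank correlation, since both distributions are heavily skewed. A stdlib-only sketch of Spearman’s rank correlation with tie handling (the sample data is invented; the abstract does not specify the authors’ exact statistical procedure):

```python
def rankdata(values):
    """Average ranks (1-based), with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented counts for six hypothetical data sets
citations = [0, 0, 1, 2, 5, 12]
alt_scores = [3, 0, 0, 1, 0, 2]
print(round(spearman(citations, alt_scores), 3))  # 0.031
```

A value this close to zero is the kind of result behind the abstract’s “no correlation” finding.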

Citation: Peters, I., Kraker, P., Lex, E., et al. (2016). Research data explored: an extended analysis of citations and altmetrics. Scientometrics, 107: 723. doi:10.1007/s11192-016-1887-4