
Research Impact Metrics

Short for alternative metrics, altmetrics are measures of web-based scholarly interaction. They aim to capture how often research is tweeted, blogged, downloaded, or bookmarked.

Their development can be seen as a response to the impact of social networking on the research environment. Use altmetrics to:

  • demonstrate dissemination of research findings beyond disciplinary boundaries
  • demonstrate media coverage of research or teaching, including social media activity
  • demonstrate participation in social impact activities that span disciplines
  • demonstrate sustained evidence of media and community engagement with national and international impact

Alternative metrics can compensate for some of the limitations of traditional research metrics and present a more well-rounded picture of research impact.

They should be used as a supplement to traditional metrics. Alternative measures can assist:

  • where citation databases do not provide coverage of a discipline
  • in assessing the impact of research outputs that do not attract traditional metrics, such as citations
  • where impact in non-scholarly environments is significant
  • in determining how research is being implemented in practice
  • in uncovering the influence of very recently published work

See Altmetrics: a manifesto for further explanation of the development of this approach to measuring scholarly impact.

Specific altmetric tools

Altmetric Explorer is a web-based platform that allows anyone at Macquarie University to track, search, and measure the online conversations about their research and that of the University. It tracks mentions in major news sources, blogs, government policy documents, Wikipedia, and social media from July 2011 onwards.

You can use Altmetric Explorer to explore mentions in those sources for:

  • all publications by authors affiliated with Macquarie University
  • specific authors
  • Faculties, Departments, or Schools
  • subject areas

An example:

Chapman, S., Alper, P., Agho, K., & Jones, M. (2006). Australia's 1996 gun law reforms: faster falls in firearm deaths, firearm suicides, and a decade without mass shootings. Injury Prevention, 12(6), 365-372. doi:10.1136/ip.2006.013714

While the most important part of an Altmetric report is the qualitative data, it's also useful to put attention in context and see how some research outputs are doing relative to others.
The Altmetric Attention Score for a research output provides an indicator of the amount of attention that it has received.
The score is a weighted count

The score is derived from an automated algorithm and represents a weighted count of the amount of attention picked up for a research output. It is weighted to reflect the relative reach of each type of source: the average newspaper story is more likely to bring attention to a research output than the average tweet. This is reflected in the default weightings:

[Table of default per-source weightings; the only row labels preserved here are Sina Weibo, Policy Documents (per source), and Open Syllabus.]

The Altmetric Attention Score always has to be a whole number. This means that mentions that contribute less than 1 to the score sometimes get rounded up to one. So, if we picked up one Facebook post for a paper, the score would increase by 1, but if we picked up 3 more Facebook posts for that same article, the score would still only increase by 1. 
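The rounding behaviour described above can be illustrated with a short sketch. This is not Altmetric's actual algorithm; the weights below (and the score function itself) are assumptions chosen only to reproduce the whole-number rounding described in the text, where a per-source contribution below 1 counts as 1.

```python
import math

# Assumed, illustrative per-mention weights; Altmetric's real defaults
# differ by source and are defined by their algorithm.
WEIGHTS = {"news": 8, "blog": 5, "tweet": 1, "facebook": 0.25}

def attention_score(mentions):
    """Weighted count of mentions, returned as a whole number.

    Each source's contribution below 1 is rounded up to 1, matching the
    behaviour described above: one Facebook post adds 1 to the score,
    and four Facebook posts still add only 1.
    """
    score = 0
    for source, count in mentions.items():
        contribution = WEIGHTS[source] * count
        if 0 < contribution < 1:
            contribution = 1  # sub-unit contributions round up to 1
        score += contribution
    return math.floor(score)

print(attention_score({"facebook": 1}))          # 1
print(attention_score({"facebook": 4}))          # 1 (0.25 * 4 = 1.0)
print(attention_score({"news": 1, "tweet": 3}))  # 11 (8 + 3)
```

The per-source rounding explains why a few extra low-weight mentions may not move the score at all.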

Impactstory allows the user to create an online profile that gathers usage data from many online research-sharing platforms.

Benefits of altmetrics:

  • Speed: can accumulate more quickly than traditional metrics such as citations
  • Range: can be gathered for many types of research outputs, not just scholarly articles
  • Macro view: can give a fuller picture of research impact using many indicators, not just citations
  • Public impact: can measure impact outside the academic world, where people may use but not formally cite research
  • Sharing: if researchers get credit for a wider range of research outputs, such as data, it could motivate further sharing

Limitations of altmetrics:

  • Reliability: like any metric, there is potential for gaming. Altmetrics may also indicate popularity with the public, but not necessarily quality research
  • Difficulty: can be difficult to collect; for example, bloggers or tweeters may not use unique identifiers for articles
  • Relevance: there are many different metric providers, and it can be hard to determine which are the most relevant and worth the time to collect
  • Acceptance: currently, many funders and institutions use traditional metrics to measure research impact
  • Context: use of online tools may differ by discipline, geographic region, and over time, making altmetrics difficult to interpret

Scopus altmetrics

Scopus offers article level social activity metrics that include: 

  • Mendeley Readers
  • Facebook
  • Twitter
  • Google+

Mendeley readers

Readership statistics can be found in the Mendeley Catalogue, on the right-hand side of a reference entry.

The 'X Readers in Mendeley' statistic represents the total number of Mendeley users who have this reference in their personal Mendeley library; the other statistics show the top disciplines, status, and country of those users. These statistics are updated regularly.

Want to know more?