Short for alternative metrics, altmetrics are a measure of web-based scholarly interaction. They aim to measure such things as how often research is tweeted, blogged, downloaded or bookmarked.
Their development can be seen as a response to the impact of social networking on the research environment. Alternative metrics can be used to compensate for some of the limitations of traditional research metrics and to present a more well-rounded picture of research impact. They should be used as a supplement to traditional metrics, not as a replacement for them.
See Altmetrics: a manifesto, posted on altmetrics.org, for further explanation of how this approach to measuring scholarly impact developed.
Altmetric Explorer is a web-based platform that allows anyone at Macquarie University to track, search, and measure the online conversations about their research and that of the University. It tracks mentions in major news sources, blogs, government policy documents, Wikipedia, and social media from July 2011 onwards.
You can use Altmetric Explorer to explore mentions of a research output in those sources. For example:
Chapman, S., Alper, P., Agho, K., & Jones, M. (2006). Australia’s 1996 gun law reforms: faster falls in firearm deaths, firearm suicides, and a decade without mass shootings. Injury Prevention, 12(6), 365-372. doi:10.1136/ip.2006.013714
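If you prefer to check mentions programmatically rather than through the Explorer interface, Altmetric also exposes a public details API. The sketch below queries it for the DOI of the paper above; the endpoint and field names reflect Altmetric's publicly documented v1 API, but check the current documentation and rate limits before relying on it, and note that the response returns 404 when an output has no recorded mentions.

```python
import json
import urllib.request

# DOI of the example paper above (Chapman et al., 2006).
DOI = "10.1136/ip.2006.013714"

# Altmetric's public v1 details endpoint (unauthenticated use is rate-limited).
url = f"https://api.altmetric.com/v1/doi/{DOI}"

with urllib.request.urlopen(url) as response:
    data = json.load(response)

# Field names assume the documented v1 response format; .get() guards
# against fields that are absent for a given output.
print("Altmetric Attention Score:", data.get("score"))
print("News outlets:", data.get("cited_by_msm_count", 0))
print("Tweeters:", data.get("cited_by_tweeters_count", 0))
print("Policy sources:", data.get("cited_by_policies_count", 0))
```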
While the most important part of an Altmetric report is the qualitative data, it's also useful to put attention in context and see how some research outputs are doing relative to others.
The Altmetric Attention Score for a research output provides an indicator of the amount of attention that it has received.
The score is derived from an automated algorithm and represents a weighted count of the attention Altmetric has picked up for a research output. It is weighted to reflect the relative reach of each type of source: the average newspaper story, for example, is more likely to bring attention to a research output than the average tweet. This is reflected in the default weightings:
Source | Default weighting
News | 8
Blogs | 5
Twitter | 1
Facebook | 0.25
Sina Weibo | 1
Wikipedia | 3
Policy Documents (per source) | 3
Q&A | 0.25
F1000/Publons/Pubpeer | 1
YouTube | 0.25
Reddit/Pinterest | 0.25
LinkedIn | 0.5
Open Syllabus | 1
Google+ | 1
The Altmetric Attention Score is always a whole number, so mentions that contribute less than 1 to the score are sometimes rounded up to 1. If Altmetric picks up one Facebook post for a paper, the score increases by 1; if it then picks up three more Facebook posts for that same paper, the score still only increases by that same 1, because four posts at 0.25 each add up to exactly 1. A simplified sketch of this arithmetic follows.
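To make the weighting and rounding concrete, here is a minimal Python sketch of the arithmetic described above. It assumes a simple model: multiply each source's mention count by its default weighting, sum the contributions, and round the total up to a whole number. Altmetric's actual algorithm is more sophisticated (it also considers who mentioned the output and adjusts for biases), so treat this as an illustration only, not the real scoring code.

```python
import math

# Default source weightings from the table above (illustrative subset).
DEFAULT_WEIGHTS = {
    "news": 8.0,
    "blogs": 5.0,
    "twitter": 1.0,
    "facebook": 0.25,
    "wikipedia": 3.0,
    "policy": 3.0,
}

def attention_score(mention_counts):
    """Return a whole-number score from per-source mention counts.

    Toy model only: weighted sum of mentions, rounded up so that any
    attention at all scores at least 1. Real Altmetric scores will differ.
    """
    total = sum(
        DEFAULT_WEIGHTS.get(source, 0.0) * count
        for source, count in mention_counts.items()
    )
    return math.ceil(total) if total > 0 else 0

# One Facebook post contributes 0.25, which rounds up to a score of 1...
print(attention_score({"facebook": 1}))            # 1
# ...and three more posts (four in total, 4 x 0.25 = 1.0) still give 1.
print(attention_score({"facebook": 4}))            # 1
# A news story (8) plus two tweets (2 x 1) gives 10.
print(attention_score({"news": 1, "twitter": 2}))  # 10
```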
Impact Story allows the user to create an online profile that gathers usage data from many online research-sharing platforms.
Speed: can accumulate more quickly than traditional metrics such as citations
Range: can be gathered for many types of research outputs, not just scholarly articles
Macro view: can give a fuller picture of research impact using many indicators, not just citations
Public impact: can measure impact outside the academic world, where people may use but not formally cite research
Sharing: If researchers get credit for a wider range of research outputs, such as data, it could motivate further sharing
Reliability: like any metric, there's a potential for gaming. Also, altmetrics may indicate popularity with the public, but not necessarily research quality
Difficulty: can be difficult to collect; for example, bloggers or tweeters may not use unique identifiers such as DOIs for the articles they mention
Relevance: there are many different metric providers available and it can be hard to determine which are the most relevant and worth taking time to collect
Acceptance: currently, many funders and institutions use traditional metrics to measure research impact
Context: use of online tools may differ by discipline, geographic region, and over time, making altmetrics difficult to interpret
Scopus also offers article-level social activity metrics alongside its citation data.
Readership Statistics can be found in the Mendeley Catalogue, on the right-hand side of a reference entry.
The 'X Readers in Mendeley' statistic represents the total number of Mendeley users who have that reference in their personal Mendeley library; the other statistics show the top disciplines, statuses, and countries of those users. These statistics are updated regularly.