What Does the Future Hold for Altmetrics?
Altmetrics (non-traditional, article-level alternatives to citation-based impact metrics in scholarly publishing) have passed their fifth birthday. While debates continue over important issues (the potential gaming of altmetrics, how to weight the various events, etc.), they are helping to engage readers and drive article downloads. Who hasn’t found themselves clicking on a most-shared or most-commented article list?
But can altmetrics, over the long term, increase the recognition of the most engaging articles and perhaps influence research and practice?
Altmetrics: A Quick History
Peer-reviewed articles can have a major effect on current research and practice. How can this impact be measured? Traditionally, a journal was judged by such items as its Impact Factor (as determined by Thomson Reuters), peer review, and citation counts such as the h-index.
However, in 2010 a chorus of voices put forward the idea of alternative metrics (altmetrics) to augment the existing measures for an article. Altmetrics are non-traditional, article-level metrics that can include journal comments, blog mentions, Wikipedia mentions, Tweets, Facebook posts, Mendeley and CiteULike bookmarks, and many other items. Beyond articles, altmetrics can also measure videos, individuals, books, journals, and a host of other content types.
Five years later, altmetrics have grown and are now embraced by most major publishers and journals (Elsevier, Wolters Kluwer, Nature Publishing Group, the Public Library of Science, and many others). With more than 8,000 journals using altmetrics, they are quickly becoming an expectation rather than a perk.
They supplement the Impact Factor, pageviews, peer review, and the like in determining how articles are measured. A yearly conference has blossomed, and the National Information Standards Organization is seeking public comments on definitions and use cases for this maturing area. Undoubtedly, recognized standards would benefit all stakeholders.
Companies such as Altmetric.com, Plum Analytics, Impactstory, Kudos, and others have been successfully providing altmetrics services to authors, journals, publishers, and institutions for years.
The Debate Continues
While altmetrics are mainstream and widely used, several concerns or questions remain in the scholarly publishing space.
The gaming of altmetrics is one of these issues. With relatively little effort, an article or paper can show a big gain that may not correlate to its quality. Bots can make the value of the metrics even murkier.
Weighting of events such as Tweets, Wikipedia mentions, and Mendeley bookmarks is another issue. When weights are assigned by a provider or a user (is a blog mention worth more than a Facebook post?), a subjective element enters the mix and muddies the waters.
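The subjectivity of weighting is easy to see in a sketch. The weights and event names below are hypothetical, not taken from any real provider; a different, equally defensible set would rank the same article very differently.

```python
# Hypothetical event weights -- another provider could justify
# a completely different set, which is the subjectivity at issue.
WEIGHTS = {
    "tweet": 0.25,
    "facebook_post": 0.25,
    "blog_mention": 5.0,
    "wikipedia_mention": 3.0,
    "mendeley_bookmark": 1.0,
}

def weighted_score(event_counts):
    """Combine raw event counts into one composite score."""
    return sum(WEIGHTS.get(event, 0.0) * count
               for event, count in event_counts.items())

article = {"tweet": 40, "blog_mention": 2, "mendeley_bookmark": 15}
print(weighted_score(article))  # 10.0 + 10.0 + 15.0 = 35.0
```

Doubling the blog-mention weight would double a third of that score at a stroke, which is exactly why a composite number without published weights is hard to interpret.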
Do altmetrics help determine the impact of the research? They likely augment (or can contradict) traditional metrics such as the Impact Factor and article downloads. Advocates point out that these metrics show engagement rather than scientific accomplishment. Cassidy R. Sugimoto recently said, “Altmetrics needs a far greater protocol and greater validity. Altmetrics have been around for five years, but still only looks at the impact of publications. Nothing has been made visible that wasn’t before.”
For articles behind a paywall (as opposed to open access), altmetrics results will inevitably be lower. How will they compare with open access articles in the same field?
Finally, some in the publishing world question the very term altmetrics and whether it is already passé. What these metrics measure is now so mainstream that “social media metrics” might be a better term than “alternative metrics.”
Diving In Is the Only Option
Warts aside, altmetrics are here to stay and are only growing stronger. Their tenth birthday will find altmetrics even more mainstream and less alt.
Readers do engage with altmetrics, and their use on a journal’s website gives publishers a chance to highlight the articles people are interacting with most frequently. Every publisher should have a Top 10 or Top 100 list of its most engaging articles. Aside from posting on the publication website, these lists can be sent in an email to readers or subscribers, which will certainly drive clicks. The email might also be an additional sponsorship or advertising opportunity.
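Producing such a list is mechanically simple once engagement scores are in hand. The titles and numbers below are invented for illustration; in practice the scores would come from an altmetrics provider’s data feed.

```python
# Toy engagement scores keyed by article title; a real list would be
# built from data supplied by an altmetrics service.
engagement = {
    "Article A": 120,
    "Article B": 310,
    "Article C": 45,
    "Article D": 198,
}

def top_n(scores, n=10):
    """Return article titles ranked by engagement, highest first."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [title for title, _ in ranked[:n]]

print(top_n(engagement, n=3))  # ['Article B', 'Article D', 'Article A']
```

The hard part is not the sorting but deciding what “engagement” means, which circles back to the weighting debate above.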
A badge for the top articles on the contents page or article page will also generate pageviews and likely drive more Tweets and shares. No definitive study has yet shown a correlation between altmetrics use and increased traffic, but that may just be a matter of time.
Using a service such as Altmetric.com makes sense, since these providers offer an out-of-the-box solution that makes metrics available to all readers. It is relatively easy to get started.
Equally important is the use of altmetrics to provide evidence of the broad impact and value of research by an individual or an institution. Groups such as the National Science Foundation and the Gates Foundation will be looking to these metrics for evidence of impact.
Compared with slower, long-term metrics, altmetrics reflect the speed of life online. They enable an author, publisher, or institution to get a more immediate read on how research is playing out in the broader world or in the mainstream press.
Whether they have already deployed altmetrics or are still considering them, publishers and other institutions should experiment; interesting opportunities will emerge. Altmetric.com posts its Top 100 articles for the year, and any such list gives business development and marketing staff a reason to pause and consider opportunities.
This type of list can also get readers talking about what belongs there and what is the odd duck, and that is a great conversation to foster.
John Bond is a publishing consultant with 25-plus years’ experience in scholarly publishing. Previously he was the chief content officer for SLACK Incorporated, an STM publisher in New Jersey.