ALCTS hosted an e-forum about Analytics for Electronic Resources on December 9-10, 2014. An archive of the conversation can be found at http://lists.ala.org/sympa/info/alcts-eforum, and a summary created by the organizers (Jennifer Bazeley, Interim Head, Technical Services, Miami University and George Stachokas, Special Assistant to the Dean for Project Management and Assistant Professor of Library Science, Purdue University Libraries) is provided below.
Day 1 Summary:
-COUNTER and SUSHI standards: their strengths and weaknesses, Release 3 versus Release 4, compliance issues in the publisher community, which reports are commonly compiled, how libraries use them, and common pitfalls. Issues with discovery services inflating usage statistics in database reports; sessions and searches versus record clicks and result views.
-Software for usage statistics: third-party vendor software from EBSCO and Serials Solutions, Excel, and the open-source CORAL.
-Differences between analyzing online encyclopedias as databases versus e-books.
-Evaluating usage statistics and cost data in context: other considerations in making decisions about e-resources, such as the number of papers authored or cited by faculty, impact metrics, license terms, and library patron/faculty input.
-Other types of usage stats for evaluating e-resources: Google Analytics, link resolver stats, EZproxy stats.
-How does one define use? The question at the center of everything: are the metrics we receive valuable? Do they reflect meaningful patron use? These questions may have different answers depending on the type and size of library where one works.
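Much of the cost-and-usage discussion above ultimately feeds a cost-per-download calculation, a figure the forum cautioned against taking at face value. As a minimal sketch of that arithmetic, assuming a simplified CSV export in the spirit of a COUNTER JR1 report (the column names and layout here are hypothetical; real JR1 files include extra header rows and monthly columns that would need trimming first):

```python
import csv
import io

def cost_per_download(jr1_csv, costs):
    """Compute cost-per-download per journal from simplified JR1-style rows.

    jr1_csv: CSV text with 'Journal' and 'Reporting Period Total' columns
             (a hypothetical simplified layout, not the full COUNTER format).
    costs:   dict mapping journal title -> annual subscription cost.
    Journals with zero downloads or no known cost are skipped.
    """
    results = {}
    for row in csv.DictReader(io.StringIO(jr1_csv)):
        title = row["Journal"]
        downloads = int(row["Reporting Period Total"])
        if downloads and title in costs:
            results[title] = round(costs[title] / downloads, 2)
    return results

# Illustrative data only.
sample = """Journal,Reporting Period Total
Journal of Examples,400
Annals of Testing,25
"""
print(cost_per_download(sample, {"Journal of Examples": 1000.0,
                                 "Annals of Testing": 500.0}))
# {'Journal of Examples': 2.5, 'Annals of Testing': 20.0}
```

The simplicity of the calculation is exactly why the forum urged caution: the raw download counts it divides by are shaped by discovery-layer inflation, platform differences, and how "use" is defined in the first place.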
Day 2 Summary:
Leveraging Systems and Tools
-Limited functionality of specialized analytics tools provided by major vendors often leads to continuing reliance on other tools and manual work.
-Libraries, especially smaller institutions, sometimes find the use of specialized tools to be of limited value due to high cost, greater effort required for data entry and maintenance, and limited return on investment.
Making Improvements in Systems and Tools for Analytics
-Systems generally fail to meet current standards; new technology is not welcome if it does not address current problems.
-A call was made for simplicity and uniformity in systems and tools, as well as interoperability among tools provided by different vendors.
-Libraries should be careful not to waste time and effort over-analyzing collections that must be retained due to accreditation or other academic requirements.
-JUSP was cited as an interesting alternative due to its ease of use, free availability to UK libraries, and adherence to widely applicable standards.
-A suggestion was made to work with vendors to develop new, simplified analytics tools designed specifically for libraries.
Organizational Structures and Personnel
-Libraries often combine gathering usage statistics with other professional roles or assign this work to support staff positions.
-Some libraries are developing new collections strategist positions that conduct more complex collection analysis based on a wider variety of analytics data.
-Pay-per-view and other on-demand services could make article-level statistics more useful to many libraries than journal-level statistics.
Upcoming Session at ALA Midwinter:
We would also like to invite you to join us at ALA Midwinter in Chicago for the next meeting of the Collection Management and Electronic Resources Interest Group (CMERIG).
Title: Resource Sharing of Electronic Resources: Problems, Opportunities, and Alternatives.
Time and location: Sun, Feb 1, 3:00-4:00 pm, MCP-W176c.
We will discuss how resource sharing of electronic resources differs from interlibrary loan services based on physical materials. Changes in technology, license terms, business models, and best practices for electronic resources management require that libraries reinvestigate how to provide the best possible information services to the contemporary user. What are the implications of these changes for libraries in terms of administration, organizational structure, and professional specialization, as well as routine business processes and workflows?
At our meeting at the ALA Annual Conference, we will then invite libraries to share case studies of how they have improved resource sharing operations at their own institutions.
Sources Referenced in the Discussion:
Impact of Library Discovery Technologies: A report for UKSG
Valerie Spezi, Claire Creaser, Ann O’Brien, and Angela Conyers, November 2013
Data Driven Collection Assessment using a Serial Decision Database
Diane Carroll, Joel Cummings
Serials Review 36.4 (2010): 227-239
Understanding the Data Around Us: Gathering and Analyzing Usage Data
NISO Forum, November 2007
Measures of Health Sciences Journal Use: A Comparison of Vendor, Link-resolver, and Local Citation Statistics
Sandra L. DeGroote, Deborah D. Blecic, and Kristin E. Martin
Journal of the Medical Library Association 101.2 (2013): 110-119
Maximizing Google Analytics: Six High Impact Practices
Tabatha Farney, Nina McHale
Library Technology Reports 49.4 (2013)
Web Analytics in the Library
Library Technology Reports 47.5 (2011)
The Concept of Electronic Resource Usage and Libraries
Rachel A. Fleming-May and Jill E. Grogg
Library Technology Reports 46.6 (2010)
Garbage In, Gospel Out: Twelve Reasons Why Librarians Should Not Accept Cost-Per-Download Figures at Face Value
Serials Librarian 63.2 (2012): 192-212
Gathering, Evaluating, and Communicating Statistical Usage Information for Electronic Resources
Managing Electronic Resources: A LITA Guide (2012): 87-120
Perceptions of Value and Value Beyond Perceptions: Measuring the Quality and Value of Journal Article Readings
Carol Tenopir and Donald W. King
Serials 20.3 (2007): 199-207
Technical Services Transparency
Jennifer W. Bazeley and Becky Yoose
Library Resources & Technical Services 57.2 (2013): 118-127
Journal Usage Statistics Portal (JUSP): http://jusp.mimas.ac.uk/
PIRUS Code of Practice (2013): http://www.projectcounter.org/pirus.html