Editors: Ann P. Dougherty, Mountainside Publishing; Richard M. Dougherty, University of Michigan, Emeritus
Contributing Editors: Mignon Adams, University of the Sciences in Philadelphia; Larry Hardesty, Austin College; William Miller, Florida Atlantic University; Maureen Pastine, Temple University
Vol. 23, No. 4 (March 2003)
Meaningful Measures for Libraries
by Carla Stoffle and Shelley Phipps
Academic libraries do not exist in isolation from their institutions. Escalating economic pressures and demands for accountability from state governments, accrediting associations, and governing boards are changing how institutions of higher education operate and measure their success. These demands impact library operations and measures of success as well.
It has become important for a library to be able to assess the value it adds to organizationally identified outcomes. These outcomes include how the library makes a difference to its constituents, how it compares to similar institutions, and what it contributes to the teaching and learning of students. How does it do that in an environment where traditional measures have been input and output focused?
The ARL New Measures Initiative
In January 1999, the Association of Research Libraries (ARL) began its “New Measures Initiative” to answer these questions. The initiative is based on the belief that, by working together and testing a series of small pilot projects, academic libraries can develop assessment tools that will be broadly applicable across all libraries. Within this tool set will be
Measures that demonstrate how the library contributes to institutional outcomes in learning and research, and
Measures that help determine library performance and cost effectiveness across a broad span of services and functions (i.e., provide data that allow academic libraries to benchmark best practices with other institutions).
The New Measures Initiative aims to develop performance measures and assessment instruments that can be used by all academic libraries and has benefited from collaboration with the Association of College and Research Libraries (ACRL). Data collection across libraries will provide local institutions with their own and comparable data to benchmark best practices and implement improvements. Continuous improvement based on users’ needs and innovative peer practice will lead to the collective success of libraries in these economically difficult and technologically challenging times.
LibQUAL+™ — Focus on Outcomes for Users. After three years of research and development under the direction of researchers from Texas A&M, LibQUAL+™ is being used by 325 academic and other libraries and is about to be implemented in European countries. It is a web-based evaluation tool that measures library users’ perceptions of service quality based on the gaps between desired, perceived, and minimum expectations of service. It provides reliable, comparable data for benchmarking purposes. Participating libraries send web surveys to a representative sample of users via e-mail, inviting them to give feedback on library services. The survey measures service quality in five areas: Service Affect, Library as Place, Reliability, Personal Control, and Information Access (Spring 2002). The survey data are centrally analyzed, and reports are generated for each participating institution. Participants can track their progress and improvement over time, as well as compare their service with that of other institutions in an effort to establish quality benchmarks.
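The gap model behind this kind of survey can be sketched in a few lines. The following is an illustrative computation, not the actual LibQUAL+™ instrument: each respondent rates minimum-acceptable, desired, and perceived service levels for an item, and the library looks at the average “adequacy” gap (perceived minus minimum) and “superiority” gap (perceived minus desired). The field names and ratings are invented for the example.

```python
# Illustrative gap scoring in the LibQUAL+(TM) style (a sketch, not the real
# survey): adequacy = perceived - minimum, superiority = perceived - desired.

def gap_scores(responses):
    """responses: dicts with 'minimum', 'desired', 'perceived' ratings.
    Returns the mean adequacy and superiority gaps across respondents."""
    n = len(responses)
    adequacy = sum(r["perceived"] - r["minimum"] for r in responses) / n
    superiority = sum(r["perceived"] - r["desired"] for r in responses) / n
    return adequacy, superiority

# Invented sample ratings on a 1-9 scale.
sample = [
    {"minimum": 5, "desired": 8, "perceived": 6},
    {"minimum": 4, "desired": 7, "perceived": 7},
    {"minimum": 6, "desired": 9, "perceived": 5},
]
adequacy, superiority = gap_scores(sample)
# A positive adequacy gap means service exceeds minimum expectations;
# a negative superiority gap means service still falls short of desired levels.
print(adequacy, superiority)  # 1.0 -2.0
```

Centrally computed gaps of this form are what make the results comparable across institutions: every participating library is measured against its own users’ stated expectations on the same scale.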
E-Metrics — Understanding the Usage of Electronic Collections. LibQUAL+™ feedback indicated that users prefer to access information from their desktops, and the availability of electronic texts increases annually. It is increasingly important, therefore, to collect data that will measure the impact of choosing this strategy for providing access to information. In February of 2000, a core group of 24 ARL libraries began the E-Metrics Project, in which they sought to:
Identify key statistics needed to evaluate effectively “those electronic information resources and/or services that users access electronically via a computer network,”
Gather data through participant survey questionnaires (of both libraries and vendors) and site visits, and
Evaluate the data and refine the collection process for future and ongoing evaluations.
Data collected included statistics on the number of electronic resources available, patron usage of those resources, expenditures on networked resources and related infrastructure, library digitization activities, and the percentage of electronic materials and transactions relative to overall library materials and usage. The lack of vendor standards that would allow for uniform collection of data has proved problematic in the E-Metrics Project. A separate effort, Project COUNTER, aims to establish an internationally accepted, extensible Code of Practice defining usage statistics that are consistent, credible, and compatible with evaluations such as those described above, while ARL libraries continue to experiment.
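One of the simpler measures the paragraph above describes is a ratio: electronic spending or transactions as a share of the library’s totals. As a minimal sketch (the function name and figures are assumptions, not the E-Metrics schema):

```python
# Sketch of an E-Metrics-style ratio: networked-resource expenditures as a
# percentage of total materials expenditures. Figures are invented.

def electronic_share(electronic_spend, total_spend):
    if total_spend <= 0:
        raise ValueError("total spend must be positive")
    return 100.0 * electronic_spend / total_spend

print(round(electronic_share(1_200_000, 6_000_000), 1))  # 20.0
```

Tracking this percentage year over year is what lets a library quantify the shift toward desktop delivery that the LibQUAL+™ feedback pointed to.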
Learning Outcomes—Measuring the Results of Educational Efforts. The influence of technology on educational institutions has changed perspectives on teaching and learning models—from input/output to a more complex web of interaction. Universities’ and colleges’ accreditation and reputations are no longer measured solely by the expertise of the faculty and the sophistication of the physical environment.
The focus has shifted to what students know and the skills they carry with them into the workforce. Among these skills are the ability to maneuver through the intricate world of information and to use technology effectively. Academic departments can no longer quantify their success by test results, but must consider the larger context of outcomes. They need to demonstrate how effectively students process information using critical, creative and collaborative thinking. Libraries are an increasingly crucial partner in teaching students skills that achieve these new departmental goals.
One of these skills, information literacy, is a particularly significant mission of academic libraries given the complexity of information systems and the myriad of information now available in electronic format. Information literacy, as defined by the ACRL, is a set of abilities requiring individuals to “recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information.”
One assessment program in development at Kent State University is aimed at creating tools to measure the effectiveness and reach of library teaching programs in undergraduate education. The Project for Standardized Assessment of Information Literacy Skills (SAILS) intends to develop a pool of questions that can be drawn on randomly to measure students’ information literacy skills upon entering the university and at graduation. It will also assess the impact of those skills on students’ academic careers.
The SAILS project aims to develop measuring systems that are easily administered, proven valid and reliable, assess at an institutional level, provide for both external and internal benchmarking, and contain items not specific to a particular institution or library. Utilization of this assessment tool will enable libraries not only to measure their own effectiveness, but also to compare their success against national standards.
Process Performance Effectiveness. In these economic times, institutions are pressured to demonstrate cost-efficiency as well as return on investment. In order for libraries to know whether their own performance and related costs are comparable to those of their peers, benchmarks are needed that provide cost information. Several projects are under way that establish “mileposts” by which to measure progress and improvement.
The Technical Services Cost Study by five universities across the country is one such project. Staff time on tasks performed within different cost centers is being tracked over the course of a year (sampling every four to six weeks). Processes include: acquisitions, cataloging, catalog maintenance, volume preparation and preservation, conversion, paid leave, automation and support services. The data gathered is available for analysis by employee type, time parameters, non-staff costs (vendors, systems, utilities), and other measures. Software is being developed and marketed to enable all libraries to input their own data and analyze it against the collective results.
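The kind of analysis the cost study supports can be sketched briefly. The following is a hypothetical roll-up, not the study’s actual software or schema: sampled time records tagged with a cost center and employee type are converted to dollars and totaled per process, so results can later be compared across institutions. All names and rates below are invented.

```python
# Hypothetical cost roll-up for a technical-services cost study: sampled
# (cost_center, employee_type, hours) records priced by assumed hourly rates.
from collections import defaultdict

def cost_by_center(records, hourly_rates):
    """records: iterable of (cost_center, employee_type, hours) tuples.
    Returns total sampled labor cost per cost center."""
    totals = defaultdict(float)
    for center, emp_type, hours in records:
        totals[center] += hours * hourly_rates[emp_type]
    return dict(totals)

# Invented rates and one sampling period's records.
rates = {"librarian": 30.0, "staff": 18.0, "student": 8.0}
sample = [
    ("cataloging", "librarian", 10),
    ("cataloging", "staff", 25),
    ("acquisitions", "staff", 15),
    ("acquisitions", "student", 20),
]
print(cost_by_center(sample, rates))
# {'cataloging': 750.0, 'acquisitions': 430.0}
```

Slicing the same records by employee type or time period, as the study does, is the same aggregation with a different grouping key.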
Economic pressures have also forced more libraries to rely on interlibrary loan services to meet users’ needs. Another project under way is the Interlibrary Loan/Document Delivery (ILL/DD) Performance Measures Study. In 1997, ARL conducted a study to obtain current data on the performance of mediated and user-initiated ILL/DD operations in research and academic libraries. This project tracked costs involved in borrowing and lending, as well as user satisfaction, for 97 research and 22 college libraries. The findings were published and had considerable impact across libraries. In 2002, with new technology available for these processes, ARL is initiating a follow-up open to all types of libraries.
Measuring Library Impact on Research. As libraries develop digital collections, it is important to keep track of the impact on faculty research. By providing electronic resources such as internet portals and e-journals to individual desktops, researchers are able to access a greater amount of materials without having to enter a library building.
While it is difficult to quantify the library’s impact on the amount and quality of research at any given institution, some advances have been made in assessing the number of library resources used by researchers (as opposed to instruction or other institutional activities).
Brinley Franklin, Director of University Libraries at the University of Connecticut, has created a methodology to measure the actual use that funded researchers make of a library’s electronic services, print collections, traditional services, and facilities. Using this methodology, libraries can construct an accurate picture of the extent of resources that support sponsored-research activities. This not only helps an institution recover indirect costs associated with grants and awards; it also provides data about the commitment to sponsored-research activities within a library’s budget priorities.
See www.arl.org/stats/ for more about the ARL initiatives.
Creating a Culture of Assessment: The University of Arizona Experience
In addition to the national efforts that have set a new direction for performance measures in academic research libraries, individual institutions are experimenting with using assessment techniques to quickly learn about changes in local users’ expectations. The importance of implementing an organizational performance measurement system has been one conclusion of such experiments. One institution, the University of Arizona Library, created a Performance Effectiveness Management System (PEMS) in 1998.
PEMS is designed to align individual, unit, and whole organization efforts with strategic goals set by the Library and to provide measures that indicate success, progress, and the need for improvement. Developing a system approach to measurement helps develop an internal culture of assessment where decisions are guided by facts, research and analysis, and where services are planned and delivered to maximize positive outcomes for customers and stakeholders.
Staff are encouraged:
To care about what results they produce,
To value the actual impact they have on the educational or research process, and
To know how these results relate to user expectations.
Creating this culture can ensure that there is a commitment to continuing assessment, development of new tools, and use of data for continuous improvement. Without an emphasis on culture change, barriers will remain to the full acceptance of the use of performance measures.
PEMS: How does it work?
The PEMS system requires units to continuously assess needs of their assigned faculty and student groups. Using varying techniques, teams engage in discovering what is most important to their customer groups about the services they provide. They use this information to formulate standards or performance targets for the particular service activity. They gather data on progress toward these standards and report periodically to the Library Cabinet and Strategic Planning Team.
After analyzing the needs and assessing where improvement can make a difference, the teams either engage in specific projects to increase their effectiveness or assign individuals responsibility for increasing the amount or quality of individual work that results in outcomes for students and faculty. Thus the system supports individual goal setting that furthers the unit’s service effectiveness in areas customers have identified as important. A strategic, future-oriented focus is structured into the system, as the teams start the year’s planning by understanding their role in achieving the Library’s three- to five-year strategic goals.
As teams create their strategic framework for the year, they set outcome, output and quality measures to assess success. In some cases outcomes are measured directly with customers. In other cases, needs assessments lead to the creation of output or quality measures that can be deduced from the identified need. For example, if a team assesses that only a limited portion of its potential user group is using its services, they may set a standard to increase the number of instructional sessions or increase the number of students reached. If they recognize that limited budgets require selection of materials that have a high potential for usage, they may set a standard that measures actual usage of material purchased. If they identify that timeliness is an important aspect of the service they give, they may set a standard for turn-around time, from request to delivery. If they identify that accuracy of information is critical to the users of their service they may set a related standard.
Examples. Examples of outcome measures may include:
80 percent of science and engineering information needed by SET customers is provided by the Library.
95 percent of the serials acquired in science/engineering areas are used at least once each year.
75 percent increase per year of students who successfully demonstrate learning outcomes.
90 percent of users who respond to a survey express satisfaction with the digital product they used.
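Standards like these can be checked mechanically once the data exist. As a hypothetical example keyed to the serials target above (95 percent of acquired science/engineering serials used at least once each year), with invented usage counts:

```python
# Hypothetical check of a serials-usage standard: the share of acquired
# titles used at least once in the year, compared against a target rate.

def meets_usage_standard(use_counts, threshold=0.95):
    """use_counts: per-title annual use counts. Returns (rate, met?)."""
    used = sum(1 for c in use_counts if c >= 1)
    rate = used / len(use_counts)
    return rate, rate >= threshold

counts = [3, 0, 12, 1, 7, 0, 2, 5, 9, 1]  # invented: 8 of 10 titles used
rate, ok = meets_usage_standard(counts)
print(rate, ok)  # 0.8 False
```

A result below the standard, as in this invented sample, is exactly the signal that would prompt a team to revisit its selection criteria.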
Some standards use increased output as surrogate measures for an intended outcome. For example:
Increase by at least three the number of partnerships that impact course content to include information literacy components, with faculty members not previously involved in instructional partnerships (the intended outcome is the increased capability of the students reached to select and evaluate information sources).
95 percent of UA faculty will receive information about intellectual property rights and copyright (the intended outcome is that faculty is able to utilize this information in their teaching and publication efforts).
Some standards relate to the user’s need for access or service in a timely fashion or a need for accuracy that will facilitate access:
95 percent of material (all formats) will be shelved within 20 hours at each site with 97% accuracy.
90 percent of traditional reserve requests at all sites will be available to customers within 48 hours after receiving the request.
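The timeliness standards above reduce to measuring elapsed time per request. A minimal sketch, with invented timestamps, of the reserve-request check (share of requests made available within 48 hours of receipt):

```python
# Hypothetical turnaround check for a reserve-processing standard: the
# fraction of requests made available within the time limit.
from datetime import datetime, timedelta

def within_standard(requests, limit_hours=48):
    """requests: (received, available) datetime pairs."""
    ok = sum(1 for received, available in requests
             if available - received <= timedelta(hours=limit_hours))
    return ok / len(requests)

# Invented request records.
reqs = [
    (datetime(2003, 3, 3, 9, 0), datetime(2003, 3, 4, 15, 0)),  # 30 hours
    (datetime(2003, 3, 3, 9, 0), datetime(2003, 3, 6, 9, 0)),   # 72 hours
    (datetime(2003, 3, 4, 8, 0), datetime(2003, 3, 5, 20, 0)),  # 36 hours
    (datetime(2003, 3, 4, 8, 0), datetime(2003, 3, 6, 7, 0)),   # 47 hours
]
print(within_standard(reqs))  # 0.75
```

Against a 90 percent standard, the 75 percent result in this invented sample would flag the process for improvement work.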
Assessment with users has led to process improvement efforts that have decreased cycle time, increased amount and quality of service, and saved hundreds of thousands of dollars that have been reallocated to the purchase and refresh of technology and the implementation of a competitive salary structure.
Learning the tools and methods for assessing performance has been a challenge. The ARL New Measures Initiative has afforded the opportunity to use more reliable measurement methods that will also yield peer benchmarking data. The Library’s Strategic Plan now incorporates measures that can be derived from the LibQUAL+™ instrument as well as measures based on team data gathering. For example:
Degree to which access to electronic information exceeds customers’ minimum expectations (from LibQUAL+™): 15 percent increase within 5 years.
Scholarly electronic resources newly developed by the Library and available remotely to customers (120 new resources within the next 5 years).
Using Data for Strategic Decision Making
Information from the first pilot years of LibQUAL+™ indicated that users desired delivery of electronic information to the desktop and that the Library was not meeting even minimum expectations. A strategic decision was made to create a Document Delivery Team and set standards for electronic delivery of interlibrary loan material and reserve articles. Cost information from the ILL/DD project was used to research best practices and to join with other libraries using more efficient technology. This enabled customers to order interlibrary loans directly from other libraries, which reduced turnaround time. A systems analysis project team was formed to research the capability of present systems to provide the infrastructure necessary to improve access in the future. As a result, the UA Library is now participating in the development of the “Scholar’s Portal” software with a number of ARL Library partners.
Access to electronic information is indicated as highly desired by LibQUAL+™ respondents and confirmed by team interviews and surveys. This information also contributed to the setting of the standards in the Strategic Long Range Plan and influenced the UA Library’s decision to invest a larger portion of its information access budget in electronic information resources.
Evaluation of the University’s budgetary situation has led to plans to increase consortial purchases as a way of reducing overall costs to the University and a standard was set to save at least $100,000 per year on such purchases. The UA Library will be joining the SAILS project to research more efficient ways of measuring learning outcomes. As the Library builds its new technological infrastructure, we will be using the research from the E-Metrics project to assess the effectiveness of our electronic access systems—from the customer point of view and in comparison with peers involved in this national effort.
Fulfilling our Educational Mission
The PEMS system, together with its ability to use the results of national efforts such as the New Measures Initiative, enables the Library to demonstrate to the campus that all resources and staff efforts are focused on the changing expectations of our users through value-added services. Incorporating efficiency measures and developing process improvement projects also demonstrate the Library’s commitment to maximizing return on investment. Accountability is demonstrated at the institutional, unit, and individual levels.
Use of performance measurement is a way of ensuring that academic libraries retain their ability to perform their special missions within the educational process. These include:
Valuing freedom of access to information,
Providing equitable access to all levels of users, and
Increasing the information literacy of the students and faculty in this increasingly complex scholarly communication system.
Developing systems and using evolving tools created by ARL-sponsored projects and others increase the ability to measure and communicate performance outcomes and the effect of libraries on the quality of teaching and learning in our institutions.—Carla Stoffle, email@example.com, is Director of Libraries, and Shelley Phipps, firstname.lastname@example.org, is Assistant Dean for Team and Organization Development at the University of Arizona.
Julia Blixrud. “The Association of Research Libraries Statistics and Measurement Program: From Descriptive Data to Performance Measures,” 4th Northumbria International Conference on Performance Measurement in Libraries and Information Service (Washington, DC: Association of Research Libraries, 2002).
Colleen Cook. “‘Zones of Tolerance’ in Perceptions of Library Service Quality: A LibQUAL+™ Study,” portal: Libraries and the Academy, vol. 3, no. 1 (January 2003).
Brinley Franklin. “Academic Research Library Support of Sponsored Research in the United States,” 4th Northumbria International Conference on Performance Measurement in Libraries and Information Service (Washington, DC: Association of Research Libraries, 2002).
Fred Heath, Colleen Cook, Martha Kyrillidou, and Duane Webster. “The Forging of Consensus: A Methodological Approach to Service Quality Assessment in Research Libraries – the LibQUAL+ Experience,” 4th Northumbria International Conference on Performance Measurement in Libraries and Information Service (Washington, DC: Association of Research Libraries, 2002).
Martha Kyrillidou. “From Input and Output Measures to Quality and Outcome Measures, or, From the user in the life of the library to the library in the life of the user,” Journal of Academic Librarianship, volume 28, number 1: 42-46. http://www.arl.org/stats/arlstat/jal101.html
Martha Kyrillidou. “To Describe and Measure the Performance of North American Research Libraries,” IFLA Journal, summer 2001. http://www.arl.org/stats/arlstat/ifla01.html
Rush Miller, and Sherrie Schmidt. “E-Metrics: Measures for Electronic Resources,” 4th Northumbria International Conference on Performance Measurement in Libraries and Information Service (Washington, DC: Association of Research Libraries, 2002).
Shelley E. Phipps. “Performance Measurement as a Methodology for Assessing Team and Individual Performance: The University of Arizona Library Experience,” Proceedings of the 3rd Northumbria International Conference on Performance Measurement in Libraries and Information Services, August 27-31, 1999: 113-117. http://www.library.arizona.edu/library/teams/fast/biblio.html
Kenneth R. Smith. “New Roles and Responsibilities for the University Library: Advancing Student Learning Through Outcomes Assessment,” Association of Research Libraries, 2000. http://www.arl.org/stats/newmeas/HEOSmith.html
Wonsik “Jeff” Shim, Charles McClure, and John Bertot. “Preliminary Statistics and Measures for ARL Libraries to Describe Electronic Resources and Service,” 4th Northumbria International Conference on Performance Measurement in Libraries and Information Services (Washington, DC: Association of Research Libraries, 2002).
Library Issues: Briefings for Faculty and Administrators (ISSN 0734-3035) is published bimonthly beginning September 1980 by Mountainside Publishing Co., Inc., 321 S. Main St., #213, Ann Arbor, MI 48104; (734) 662-3925. Library Issues, Vol. 23, no. 4. © 2003 by Mountainside Publishing Co., Inc. Subscriptions: $80/one year; $140/two years. Additional subscriptions to same address $25 each/year. Address all correspondence to Library Issues, P.O. Box 8330, Ann Arbor, MI 48107. (Fax: 734-662-4450; E-mail: apdougherty@CompuServe.com) Subscribers have permission to photocopy articles free of charge for distribution on their own campus. Library Issues is available online with a password at http://www.libraryissues.com