Measure correctly or do the right thing? A national perspective on statistics and indicators

When the data for a core business starts declining, the existence of libraries tends to be disputed. But more is perhaps not always better, and what has previously been measured at libraries perhaps no longer constitutes a library’s core activities.

In one of the Swedish dailies, during the spring of 2010, one could read that evaluation processes had gone “from being ice-cold to lukewarm”. In other words, from being considered a grey, dreary fact of existence to evolving into something useful, at least within state and municipal activities. Indicative of this is that a number of authorities have been established in Sweden during the past 10 years for the sole purpose of monitoring and producing evaluations. In 2011 a government authority for cultural analysis will also be established.

It is not evaluation in general, but the use of indicators in particular, which has become popular. A search in the proposal from the Swedish National Agency for Higher Education for a new national quality assurance system provides more than 30 hits on the word ‘indicator’. What is the reason? Why do we want to measure more?

A long tradition of gathering statistics

We have a long tradition of gathering statistics from the library sector. Both public library statistics and research library statistics have data from the 1950s. These time series reflect what we traditionally consider to be the core functions of a library: loans, user visits, collections and reference questions.

What can the present scales of measurement tell us about the core activities of libraries? We have seen that the number of loans, visits and reference questions has declined, as have the collections of printed books. The results are by no means unique to Sweden, but part of a general trend in the Nordic countries and in Europe. The Swedish research libraries’ loan statistics indicated an upward trend over a period of years, but a closer analysis reveals that it is renewals that have increased rather than new loans. New loans have in fact declined year by year.

More has often been considered better, and libraries have in some cases been pressured by their parent organisations to increase the number of loans and visitors. When the data for a core business starts declining, the existence of libraries tends to be disputed. But more is perhaps not always better, and what has previously been measured at libraries perhaps no longer constitutes a library’s core activities. If other activities have replaced the traditional ones, the question is what those activities consist of, what scales of measurement apply, and whether new activities can be compared with old ones.

We are measuring the wrong things

An interview project during the spring of 2010 included questions about library statistics aimed at heads of libraries and decision makers at universities. The study targeted research library directors, to capture their views on the value and usefulness of the national statistics currently produced and on how they could be improved. It became quite apparent that the interviewed library directors felt that the national statistics were too traditional. They were not felt to reflect the activities taking place in libraries today, and the reality of the surveyed libraries lay well beyond the traditional scales of statistical measurement that constitute the national statistics. The libraries’ mission, operations, collections, and accessibility have changed. Not least, user patterns and needs have changed.

Statistics are not reliable

It also became apparent through the interview project that library management felt there to be quality issues with the national library statistics, especially concerning their reliability. The provision of library statistics appears to be caught in a vicious circle: libraries do not believe that the national library statistics are reliable and are therefore less careful when reporting data, which in turn further undermines confidence in the gathered statistics.

We compare the wrong things

In the national library statistics, comparisons have for instance been made between the number of downloads of e-books and the number of new loans of printed books. Comparisons have also been made between the number of physical visits to library facilities and the number of visits to libraries’ websites. Parallels have also been drawn between the increase in tuition and the decline in the number of reference questions, all in an attempt to reflect a library’s development. This line of reasoning has provoked reactions, and many believe that the comparisons rest on false grounds and that the use of e-resources is not equivalent to loans of printed materials.

We know too little

A reason why comparative statistics are perceived as less appropriate is perhaps that we in general know too little about our users and their behaviour patterns. Studies have been conducted, among other places, in England as to how students use e-books – in Sweden we know too little about how the act of reading occurs and how readers make their way into printed and digital texts.

Quality and quantity

The interview project showed that qualitative measures rather than quantitative ones are needed when communicating with management. Several libraries have taken steps in this direction, thereby making their operational activities visible to management. Stockholm University Library has implemented a major reorganisation based on the library’s main processes. This has increased awareness of how scales of measurement can be applied in various ways. Catherine Ericson-Roos, Head Librarian at the Stockholm University Library, noted in her report that “The processes that can have measurable objectives also have them. This applies to processes such as identifying and placing books on shelves, interlibrary loans, acquisitions and cataloguing. Others are dependent on customer surveys and evaluations of various kinds, such as education and information work.”

That scales of measurements in statistics are perceived as inadequate is perhaps because we are measuring what is easy to measure whilst libraries are currently in a process of change. The discrepancy between what is developed and displayable and the traditional aspects that have always been measured has simply become too large.

Are indicators the solution?

If traditional statistical approaches fail to describe library activities, perhaps an indicator can do it. The term indicator is widely used, but what do we mean by it? The Swedish National Financial Management Authority defines a performance indicator as a “measurement that can be used in assessing the success of stated goals or policies.” The starting point is that there is a system of targets. At this stage, a library that makes use of indicators can run into problems, because parent organisations have not always incorporated libraries in their governing documents. Not all local councils mention libraries in their strategic documents, and a special library, for instance, might find it difficult even to gain a clear mandate from its council.

We have no national library strategy in Sweden today. We do, however, have the Libraries Act, which is currently subject to revision. Nor do we have national indicators for libraries. The closest attempt was a project by the Swedish Library Association several years ago, in which the ISO standard 11620, Performance indicators for libraries, was translated, and a number of libraries, both public and research libraries, tried out the indicators for a period to see how useful they were and whether they could be used as a kind of benchmarking between libraries. If we instead look at the use of indicators among the different library types and at individual libraries, there are at least a couple of examples to highlight.

Public libraries

Each year the Swedish Arts Council gathers and compiles statistics for the public library sector. Subsequently, the Swedish Library Association, on its own initiative, produces an adaptation of the data sorted by county and municipality. In this way the Library Association produces 10 indicators for public libraries:

• Number of visits per capita
• Total loans per capita
• Number of children’s book loans per child 0-14 years
• Number of children’s books per child 0-14 years
• Number of books per capita
• New acquisitions per capita
• AV collections per capita
• Magazine subscriptions per 1,000 inhabitants
• Library employees per 1,000 inhabitants
• Total operating cost per capita.
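All of these indicators are simple ratios of raw counts to population figures. The following sketch shows how a few of them could be derived; every figure and field name here is hypothetical, invented purely for illustration, not actual municipal data:

```python
# Hypothetical raw figures for one municipality, as might be reported
# to the national statistics. All numbers are invented for illustration.
raw = {
    "population": 25_000,
    "children_0_14": 4_200,
    "visits": 160_000,
    "loans_total": 180_000,
    "childrens_book_loans": 52_000,
    "childrens_books": 13_000,
    "books": 90_000,
}

# Each indicator is a count divided by the relevant population base.
indicators = {
    "visits_per_capita": raw["visits"] / raw["population"],
    "loans_per_capita": raw["loans_total"] / raw["population"],
    "childrens_loans_per_child": raw["childrens_book_loans"] / raw["children_0_14"],
    "childrens_books_per_child": raw["childrens_books"] / raw["children_0_14"],
    "books_per_capita": raw["books"] / raw["population"],
}

for name, value in indicators.items():
    print(f"{name}: {value:.2f}")
```

The simplicity of the arithmetic is part of the point made below: such ratios are easy to compute, which is also why funding decisions can come to rest on them alone.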

It has become apparent in a number of municipalities that the allocation of funds is based solely on some of these indicators. Municipalities analyse library performance on the basis of the indicators, and more is then understood as better. Fewer loans or fewer visits can therefore in reality mean a decrease in acquisition funds. It is the traditional core business that makes an impact, and it is difficult for libraries to show that new activities have been implemented and require resources.

School libraries

The Swedish Arts Council gathers school library statistics. The latest gathering of statistics was done in 2008. Once again, the Library Association processed the material and constructed an indicator for school libraries: the percentage of schools with staffed school libraries out of the total number of school libraries. Staffing can in this case mean staff at the school library itself, a school library integrated with a public library, or a school using the library at another school unit. The indicator is new and has no comparable data over time, but it can safely be used in debates about the existence of school libraries.

Hospital libraries

Hospital libraries have undergone a change. From being solely patient libraries, the hospital libraries of today serve both as patient libraries and as an important information resource for hospital staff. The gathering of statistics, which the Swedish Arts Council carried out in 2010, was based on a new survey form, devised and developed to better suit the present-day activities of hospital libraries. In general terms, one can say that the questions asked of hospital libraries in this new survey relate more to research library issues than, as before, to public library issues. A new network has been established for hospital library directors, and there is now a solid foundation to proceed from on, for instance, statistics and quality issues.

University and college libraries

The National Library has by tradition supported the Swedish Arts Council in the gathering of statistics for the research libraries. Questions and definitions of terms have largely been based on international standards. The data compiled are used by Urank, a free and independent association that studies and produces rankings of Swedish universities and colleges. Urank publishes an annual ranking of Swedish universities. This ranking has included two indicators related to the library:

• The libraries’ grants from the parent organisation as a share of the university’s total assets

• Acquisitions (printed material and electronic resources) divided by the number of students.

Urank has five criteria, and within each criterion there are a number of indicators. The library is a separate criterion; others are students, teachers, education and research. One can therefore say that libraries are well represented, even if the indicators reflect a traditional image of the library.

One university library has gone further in using indicators: the Mid Sweden University Library, which in close collaboration with the university’s leadership has developed a quality index. The aim is to show how library operations work and how well the library meets the goals contained in the university’s strategy plan. The library saw the quality index as an opportunity to reach a clearer understanding of how integrated the library’s activities are with other areas within the university.

The present quality index has 16 indicators divided into four areas: research, education, operational fundamentals and growth potential. Among the indicators are more traditional ones, such as “How much of the interlibrary loan volume consists of lending?”, where the interpretation is that the library has a good stock if its lending volume is greater than its own borrowing, and “ILL: time for book orders”, whereby the time span between initiating a request and the user receiving the material should be as short as possible. But there are also indicators which attempt to describe new activities, such as visits to the library’s website in relation to the number of students and staff at the university. The library’s development potential is captured by the indicator “How many of the strategy plan’s objectives have been met during the year?”, and the library’s development of collaboration with other organisations and of finding external customers has received indicators of its own.
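The interlibrary-loan indicator described above reduces to a single ratio. A minimal sketch, using purely hypothetical figures that are not Mid Sweden University Library’s actual data:

```python
# Hypothetical ILL volumes for one year (invented for illustration).
ill_lending = 3_400    # items this library lent to other libraries
ill_borrowing = 2_100  # items this library borrowed from other libraries

# Share of the total ILL volume that consists of lending. A value above
# 0.5 is read, per the indicator's interpretation, as a sign of good stock.
lending_share = ill_lending / (ill_lending + ill_borrowing)

print(f"ILL lending share: {lending_share:.1%}")
print("net lender" if lending_share > 0.5 else "net borrower")
```

The time-span indicator (“ILL: time for book orders”) would similarly be an average of request-to-delivery durations, measured per order.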

Research libraries

The Swedish special research libraries reply to the same questions as those put to university and college libraries. It appears that a number of questions in the national statistics are not relevant to special research libraries. The results are misleading, and special research libraries are virtually invisible when compared to university and college libraries. In a recently written report, Kerstin Assarsson-Rizzi presents the situation of special research libraries within the humanities, and it becomes clear that they have difficulty maintaining their positions both in relation to their principals and to other research libraries. Resources are small at special research libraries, while their collections and expertise are in many cases totally unique. In an attempt to reflect more precisely the uniqueness of the special research libraries, the report suggests an indicator: the number of unique titles a specific special research library has registered in the union catalogue LIBRIS, compared with the total number of titles it has registered in the same catalogue. The interpretation is that the larger the share of unique titles in LIBRIS, the more important the collection. With the help of such an indicator, a special research library can emphasize its unique role to its principal. It also becomes easier to present a national scenario in which the various special research library collections complement each other and the collections of the National Library. The use of such an indicator would encourage research libraries to catalogue their unique material in LIBRIS to an even greater extent, rather than, as is so often the case today, what is most in demand. Another advantage is that the results of the indicator can be entirely machine generated.
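The proposed uniqueness indicator is indeed simple enough to be machine generated. A minimal sketch of the calculation, with hypothetical figures (in practice both counts would be extracted from LIBRIS holdings data):

```python
def unique_title_share(unique_titles: int, total_titles: int) -> float:
    """Share of a library's LIBRIS-registered titles held by no other library.

    The higher the share, the more important the collection is taken to be,
    per the report's interpretation of the indicator.
    """
    if total_titles <= 0:
        raise ValueError("library has no titles registered in LIBRIS")
    return unique_titles / total_titles

# Hypothetical example: 1,800 of a library's 12,000 registered titles
# are unique to that library.
share = unique_title_share(1_800, 12_000)
print(f"unique-title share: {share:.1%}")
```

Because both inputs already exist as catalogue records, no manual reporting from the library would be required.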

The National Library

The National Library has been commissioned by the Department of Education to develop indicators that better describe the results of its operations. For instance, the department has requested suggestions for indicators to present the results of the National Library’s service provision. The National Library has taken part in international work within ISO to develop indicators for national libraries, and these recommendations have formed the basis for its continued internal work.

The National Library’s expert group on library statistics

The responsibility for gathering official library statistics now lies with the Swedish Arts Council. The gathering of data in January 2010, covering 2009, was the first collection in which public libraries, research libraries and hospital libraries responded to the questionnaires in the same time period and in which the results were processed simultaneously. A degree of harmonization of the questions for each library type has been carried out, and a few new questions have been introduced.

The efforts to coordinate library statistics are based in the National Library’s expert group on library statistics. The expert group’s objective is to represent all types of publicly funded libraries and to gather all significant players in the field. The group studies future scenarios and works from a long-term perspective. During the coming year, the expert group will devote its time to revising the questionnaires for the different library types. The starting point is the ISO standards that deal with library statistics, which are themselves under revision. The idea is that the new areas of questions included in the standards are to be used in the Swedish questionnaire, and that all definitions are taken from the standards and thus are identical regardless of library type.

There is a parallel, ongoing process of developing a common technical system for managing library statistics. A common reporting system from which libraries can also extract data and process it locally is something that information providers and library directors have wanted. There is also an ongoing dialogue with suppliers of library systems. Here, too, there is a need to harmonize the terms used in library systems with the definitions used for statistics, in order to facilitate the extraction of statistics from the systems. The ideal would of course be that some basic data could be obtained directly from the systems and delivered to the national statistics without too much manual intervention from the libraries. It would streamline the work considerably. If the basic data could be extracted from the systems, time could instead be used to develop indicators and to produce thematic analyses.

In November 2009 the National Library was assigned the task of formulating an overall national strategy plan for publicly funded libraries. Its response was a plan published in the spring of 2010. The plan describes how the National Library in various ways wants to work towards national collaboration in the library sector. Working on a national scale with quality aspects and statistics is consistent with the assignment and well in line with the ongoing work of the expert group on statistics.

We measure – therefore we exist

There is an awareness among libraries that follow-ups of various kinds are needed and that quality work is crucial, among other things, to enable constructive dialogues with those in management who are also responsible for funding. At present there seems to be consensus on a national level that we measure the wrong things in the library sector. The question is what we want to make visible with our statistics. Is there a tangible result that can be used to request more funds? In that case, a number of indicators might be a useful tool. Who actually determines the content of an indicator then becomes a crucial point. Who has the power to assemble the set of indicators and to determine the level of achievement? And what is then tied to the outcome of the indicator? Is it the allocation of funds, the number of staff, or something else? Are a number of national indicators needed, as in Norway? Should we in that case use the international indicators developed by ISO, which are in the process of being modernized? Or should we create our own? Or collaborate with the other Nordic countries?

Are we closer to the truth about the results of library operations if we use indicators rather than traditional statistical approaches? Is it not rather the case that, in addition to producing better tangible results within the library community, we should also analyze the impact of library activities? The question is then not how many books are borrowed per capita or per student. The question is rather: do children with access to a school library achieve better educational results? Do students with access to a university library improve their academic results? Do hospital libraries create better health care? Does access to a public library increase adult education? Does the National Library contribute to the nation’s collective memory?

We measure, therefore we are – but the question is for how long. If we do not analyze the impact of library activities, we will not be able to choose a path and therefore not be able to justify the importance of a library. Of course, it is a problem that we measure the wrong things but the crucial question is: are we doing the right things?

Christine Lindmark
Executive Officer: Statistics and Evaluation
National Library of Sweden
christine.lindmark( AT )kb.se

Translated by Jonathan Pearman
