Bill Overton will never forget moving the Morris Library at Southern Illinois University in Carbondale. As president and CEO of Overton & Associates, Westminster, MD, he has spent the past 27 years relocating libraries, but this was his biggest job, at 1.6 million volumes.
Jamie LaRue, an erstwhile public librarian (recently turned consultant) in Colorado who has done some cool things (such as negotiating directly with publishers for ebooks while refusing to pay crazy amounts for popular titles), has thought-provoking things to say about the dynamics of change in libraries. Reflecting on a discussion at the Arizona Library Association where something he said apparently raised eyebrows, he expanded on his remarks in a blog post, taking particular aim at a pattern he sees (and many of us will recognize) in library organizations. A decision is made, a direction taken, and then the sabotage begins, conducted by people who contributed little to the discussion as the decision was being made.
Here’s an issue I’ve been hearing about from colleagues quite a lot lately: libraries undertaking and carrying out assessment, then ignoring or “trumping” the findings by doing what they wanted to do in the first place, while still putting a check mark next to assessment on their mental (or literal) to-do lists, as if to say, “yep, did that!” My thought in such cases is: well, no, you didn’t do that!
The constellation of Star Libraries changes dramatically from year to year. As every year, the 2013 Star Libraries list illustrates that each annual round introduces a substantial set of new Star Libraries, sees the fortunes of continuing Star Libraries change—as libraries change peer groups and gain and lose stars—and, indeed, sees many of the previous year’s honorees lose their Star Library status altogether. The explanations for these changes are varied and complex. Whether a public library gains or loses Star Library status, or sees that status change more subtly, is determined as much by the fortunes of the other libraries in its spending peer group as by the per capita service outputs of its own institution. In this year’s article, we will highlight the new Star Libraries that were not on the 2012 list, Star Libraries that maintained their star status despite changing spending peer groups, Star Libraries that gained or lost stars from 2012 to 2013, and libraries that lost Star Library status in 2013.
A major strength of the annual Star Library ratings is that while some public libraries have various kinds of built-in advantages that tend to keep them on the list, a substantial proportion of the Star Libraries are new to this recognition each year. Of 2013’s 263 Star Libraries, 67 (25.5 percent) were not Star Libraries in 2012. Notably, this year’s percentage of new Star Libraries is higher than in four of the last five years (in those four years, it ranged from 19.4 percent in fall 2009 to 24.4 percent in 2010). So, generally, the percentage of new Star Libraries has been increasing over time. Attaining Star Library status is also becoming more competitive, as, by design, Star Libraries as a percentage of all eligible public libraries have remained static at 3.5 percent—the same ratio as in fall 2009.
If you think the LJ Index of Public Library Service is not useful to your non–Star Library, guess again. There is a multitude of ways in which you can use your library’s LJ Index score and its underlying per capita statistics (circulation, visits, Internet computer use, and program attendance). First, locate your library in this spreadsheet. Review your library’s LJ Index score and the four per capita statistics on which it is based. Consider which statistic(s) is contributing most to, or dragging down, your library’s score. Then, ask the following questions.
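For readers who want to run this comparison themselves, the per capita arithmetic behind those questions is simple. The sketch below uses entirely hypothetical figures for an invented library—the population and annual totals are assumptions, not data from the LJ Index spreadsheet—and it does not reproduce LJ’s own scoring formula; it only shows how to derive the four per capita statistics and spot which one stands out.

```python
# Hypothetical example only: derives the four per capita statistics that
# underlie the LJ Index from made-up annual totals. The actual LJ Index
# score involves LJ's own standardization, which is not reproduced here.

service_population = 25_000  # assumed legal service area population

annual_totals = {
    "circulation": 310_000,
    "visits": 140_000,
    "internet_computer_use": 38_000,
    "program_attendance": 9_500,
}

# Divide each annual total by the service population
per_capita = {name: total / service_population for name, total in annual_totals.items()}

for name, value in per_capita.items():
    print(f"{name}: {value:.2f} per capita")

# Flag the strongest and weakest of the four outputs for this library
strongest = max(per_capita, key=per_capita.get)
weakest = min(per_capita, key=per_capita.get)
print(f"Highest per capita output: {strongest}")
print(f"Lowest per capita output: {weakest}")
```

Comparing each of these per capita values against libraries in the same spending peer group is what reveals whether a given statistic is lifting or dragging down the overall score.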
Since at least 1987, public libraries have been collecting and using two per capita output measures: circulation and visits. Beginning in 2001, public libraries began reporting uses of electronic resources, and, in 2007, that data element evolved to become uses of public Internet computers. And in 2005, libraries started to report total program attendance. These four per capita statistics comprise the output measures underlying the LJ Index of Public Library Service, on which the annual Star Library ratings are based. While several new data elements have been added since 2007, there have been no new output measures. We are glad to report that the state library agencies recently voted to add circulation of electronic materials, starting with the 2013 data collection. In all likelihood, electronic circulation per capita will join the four current LJ Index statistics. So what’s missing? We believe two output data elements are conspicuously absent from the federal data set: visits to library websites and usage of Wi-Fi access provided by public libraries.
Whether or not your library has been given a star rating, you can benefit from finding peers in your expenditure category and comparing stats. For the scores for all libraries included in this round of the LJ Index, download this document to see the libraries rated, their ratings, and the data from which the ratings were derived.