Whether a library is designing a building or a program, the first premise of designing for impact is figuring out what impact you’re trying to make and how you’re going to assess whether that impact is occurring. One of the most common buzzwords in librarianship today is “outcomes, not outputs”: measuring not quantitative metrics of what libraries do, such as circulation or visits, but the impact those activities have on the lives of patrons.
Monroe County Library System (MCLS) and Rochester Public Library (RPL), NY, director Patricia Uttaro credits LJ’s Lead the Change event with turning multiple small projects that had been happening across the district into a more cohesive structure capable of effecting broader change.
This past December, LJ teamed up with Electronic Resources and Libraries (ER&L) to dive deep into the use of data-driven decision-making in academic libraries in a series of three free webcasts. The series, moderated by Bonnie Tijerina, head of e-resources and serials at Harvard Library and ER&L conference coordinator—and made possible thanks to sponsorship by ProQuest, Springer, and Innovative Interfaces—explored a range of strategies academic libraries are deploying as they use data to serve their customers more effectively.
Let me start out by acknowledging that “Science and Religion in the Library” is a provocative subtitle, and to some degree it’s meant to be. Let me explain what I mean by it. For my purposes here, I’m going to define as “science” those aspects of library work that deal with figuring out and describing things as they are, and as “religion” those that deal with figuring out how things should be and why they should be that way. In the sense that I’m using the terms here, science is descriptive, and religion is prescriptive; science is involved with “is” questions, while religion is involved with “should” questions.
Jamie LaRue, an erstwhile public librarian (recently turned consultant) in Colorado who has done some cool things (such as negotiating directly with publishers for ebooks while refusing to pay crazy amounts for popular titles), has thought-provoking things to say about the dynamics of change in libraries. Reflecting on a discussion at the Arizona Library Association where something he said apparently raised eyebrows, he expanded on his remarks in a blog post, taking particular aim at a pattern he sees (and many of us will recognize) in library organizations. A decision is made, a direction taken, and then the sabotage begins, conducted by people who contributed little to the discussion as the decision was being made.
Here’s an issue I’ve been hearing about from colleagues quite a lot lately—that of libraries carrying out assessments and then ignoring or “trumping” the findings by doing what they wanted to do in the first place, while still putting a check mark next to assessment on their mental (or literal) to-do lists, indicating, “yep, did that!” My thought in such cases is: well, no, you didn’t do that!
The constellation of Star Libraries changes dramatically from year to year. As it does every year, the 2013 Star Libraries list illustrates that each annual round introduces a substantial set of new Star Libraries, sees the fortunes of continuing Star Libraries change—as libraries change peer groups and gain and lose stars—and, indeed, sees many of the previous year’s honorees lose their Star Library status altogether. The explanations for these changes are varied and complex. Whether a public library gains or loses Star Library status, or sees that status change more subtly, is determined as much by the fortunes of the other libraries in its spending peer group as by the per capita service output of its own institution. In this year’s article, we will highlight the new Star Libraries that were not on the 2012 list, Star Libraries that maintained their star status despite changing spending peer groups, Star Libraries that gained or lost stars from 2012 to 2013, and libraries that lost Star Library status in 2013.