August 18, 2017

Meaningful Measures | Assessment

Measuring outcomes can be a vital aid to justifying library work to voters, funders, and stakeholders—as well as determining strategic direction—but it can also be overwhelming.

Willie Miller | Movers & Shakers 2016 – Innovators

For Willie Miller, informatics and journalism librarian at Indiana University–Purdue University Indianapolis (IUPUI), Valentine’s Day has become a stealth data-gathering operation. He calls it “guerrilla assessment,” and it has transformed the university library (UL).

Assessing the Ambivalent Liaison | Peer to Peer Review

The drumbeat of assessment has become the cadence of higher education. In libraries, as with any organization, the managerial drive for metrics is reflexive. How do we know if we’re winning? How can we prove it to the boss?

Outcomes, Impacts, and Indicators

The Impact Survey was first used in 2009 to help gather data for the Opportunity for All study reports, conducted by the University of Washington’s iSchool with assistance from the Bill & Melinda Gates Foundation. Libraries were enlisted to connect to a web survey, the results of which were used to augment responses gathered through a telephone-based poll. To our surprise and delight, we gathered more than 45,000 survey responses in just ten weeks, with about 400 libraries participating. Even more delightful was finding that libraries were using the data from Opportunity for All as well as the reports of Impact Survey results from their own communities.

Change for Researchers’ Sake | Not Dead Yet

If there’s one word I’d choose as the single most repeated term in libraries over the course of my career (thus far) it would be “change.” And that word has usually had a good connotation for me, since I’ve always figured that if you’re going to change something, you’re going to change it for the better. But now… I’m not so sure.

What Gets You Going and Keeps You Going? | Not Dead Yet

A recent mailing from my library school alma mater (SUNY Albany) brought with it the realization that I’ve been a librarian for quite a long time (36 years and 6 months, to be precise, but who’s counting…), and yet, I feel about my work now very much as I did in my first job as a part-time reference librarian at Union College.

Data & Assessment in Academic Libraries – A free, three-part webcast series, developed in collaboration with ER&L

Building on last year’s Data-Driven Academic Libraries series of webcasts, Data & Assessment in Academic Libraries will focus on projects that range across various service points. Starting with an in-depth focus on qualitative measures used in libraries, the series will move into how data is being used in innovative ways to inform and make changes in information literacy and reference, and then conclude by looking at measures that impact collection development and discovery decisions in the digital environment.

CUNY Helps Libraries Take Stock

On June 6, the City University of New York (CUNY) held its first library assessment conference. Called Reinventing Libraries: Reinventing Assessment, the event grew from its initial target of 100 attendees to almost twice that many, and positive feedback from attendees included calls for the conference to be repeated, or even to become an annual event. Several recurring themes ran through the day as leitmotifs: a shift from exclusively quantitative assessment toward qualitative measures, libraries partnering with faculty on instruction, and the intersection of outcomes measurement and predictive analytics in a newly granular portrait of individual students’ library use.

Library Assessment as Check Mark | Not Dead Yet

Here’s an issue I’ve been hearing about from colleagues quite a lot lately: libraries carrying out assessment and then ignoring or “trumping” the findings by doing what they wanted to do in the first place, while still putting a check mark next to assessment on their mental (or literal) to-do lists, indicating, “yep, did that!” My thought in such cases is: well, no, you didn’t do that!

What’s Counted and What Counts | Peer to Peer Review

Driven by the demands of assessment, and presumably the need for statistics to prove our worth, there’s a tendency to link the importance and appreciation of the library to individual interactions with librarians. Students come to reference desks, chat us up, meet in our offices, each one counted, each one destined to be a tick mark in a spreadsheet somewhere proving how useful we are. That might be why librarians occasionally bemoan lower transaction statistics or the lack of students lined up at the reference desk. Fewer reference questions could mean the students need us less. For a lot of assessment, if it can’t be counted, it doesn’t count.