September 18, 2014

Impact Survey Aims to Help Libraries Increase, Explain Their Worth

All librarians want to serve their communities and patrons as best they can, but knowing how best to provide that service isn’t always easy. The demands of day-to-day duties, not to mention privacy concerns, can make it hard for staffers to learn the finer details of how patrons are using their services. Without a clear picture of how services like computer access are being put to use, it can be difficult to determine how to fine-tune them to the needs of users. Now, researchers at the University of Washington (UW) iSchool have introduced the Impact Survey, a tool that lets patrons anonymously report on how they use library technology while they’re using it. The survey helps librarians understand how—and when—patrons are interacting with the technological resources libraries provide—and helps them demonstrate the value of those services to local governments.

The survey asks simple demographic questions, such as users’ ages, along with what they use their time on library computers for: getting access to health care and social services, doing freelance work, using database access to work on a term paper, or visiting Facebook to see pictures of grandkids.

The Impact Survey grew out of the 2009 Opportunity for All (OFA) study. Conducted by the University of Washington’s iSchool with assistance from the Gates Foundation, the OFA study took an in-depth look at how library patrons were using public technology in libraries. “We wanted to make the survey into a tool libraries can keep using so they can get these results without building a tool of their own,” said Samantha Becker, the Impact Study’s research project manager. In 2010, she and her colleagues got a grant from the Gates Foundation to do just that.

The result is a plug-and-play survey available to libraries around the country at no cost through July 2014, when a small administration fee will be applied to the service. To start a local impact survey, librarians can plug a short snippet of HTML code into their website, so that patrons who access the site will be given the chance to participate in the survey, whether they do so through a computer at the library or remotely. The data they input bypasses the library, going instead to the UW iSchool, where researchers assemble the collected information from that library’s users into a full report for staffers, detailing the habits of technology users at their branch.

A pilot study of 400 libraries has already revealed some interesting kernels of data about broad trends among library computer users, said Becker, including the fact that many users do have Internet access available at home, but that circumstances sometimes make it easier for them to use the computer at the library. “Many households are using public technology because of competition for household access,” said Becker, pointing out that kids can’t do schoolwork on a home computer while parents are answering emails from the boss. “People are using the library as a way for everyone to get their work done.”

The outsourced nature of the Impact Survey takes local librarians out of the process, allowing them to learn more about their patrons’ needs without compromising their privacy, traditionally a strong value in librarianship. The data reported back to researchers at UW is stored on secure servers, and as research associates at the iSchool parse the data they receive for each library, they strip out any information that could identify users before they report back to local libraries.

In addition to helping librarians determine where they need to focus their tech support efforts, the resulting data can also be used to demonstrate the value of libraries to people who may not use them regularly. “Digital inclusion services are considered high value services for libraries,” Becker pointed out. But they’re not necessarily the services most readily associated with libraries: in the 2010 OCLC study The Library Brand, the percentage of people who associated libraries with books had actually increased, despite libraries’ diversification into other programs and services. So a tool that helps librarians prove the value of their digital access to local government officials and other stakeholders can be important.

That was the case in Burlington, WA, an hour’s drive north of UW. In 2011, director Maggie Buckholz realized that her library needed to improve its technology programming; the staff knew people were using the computers the library provided, but didn’t really know how. As a small library, though, it didn’t have the staff or expertise to study the question on its own. After a three-week survey in which 105 library users weighed in on how they used public computers, Buckholz said, her staff’s understanding of how patrons use technology was transformed.

Where once staffers just saw people using Facebook on library computers, they now found job-hunters, people keeping up on news and current events, and even patrons using library computers to help get their own small businesses off the ground. It especially opened their eyes to the role librarians play as tech trainers for less computer-literate patrons. That new information inspired Buckholz and her staff to put in place computer skills training that has become very popular with patrons.

“We implemented computer teaching schedules, and made teaching tech skills to our patrons an element in every staff member’s job description,” said Buckholz. “We have times when people can book librarians for special help, and we hold ‘Tech Tuesdays’ as a drop-in time when people can get tech assistance from Burlington librarians.”

Having the data about how patrons were using their services, Buckholz said, also helped Burlington librarians fend off budget cuts during tough economic times. “We weren’t just going to these budget meetings with anecdotal stories or the number of people who came through the door,” Buckholz said. “We could say ’Ten percent of the people using our computers are getting hired during an economic downturn.’” That hard data, Buckholz said, helped her keep at bay budget cuts that would have forced her to lay off staff.

The Impact Survey left its testing phase earlier this month, meaning that any library interested in participating can learn more and sign up through its website. So far, 60 libraries have signed up for the service, but Becker is aiming for a much higher enrollment. “We’re hoping that at least 30-40% of library systems use the survey in our first year,” she said. Researchers will be taking a big-picture view of all the data they process for local libraries along the way, with an eye toward a larger report on their aggregate findings in early 2015. Down the road, Impact Survey researchers are looking for ways to adapt the existing survey to provide similar services fine-tuned to the needs of academic and school libraries as well.

About Ian Chant

Ian Chant is the Associate News Editor of LJ.

Comments

  1. This would be one of the most welcome initiatives in years for academic libraries such as cegep libraries in Quebec (the equivalent of community colleges), and I guess Canadian universities as well. Libraries in cegeps have suffered cuts in staff and budget recently even though their overall use has increased! We find that our institutions’ directors are poorly informed about how library staff and services support academic success. We need to prove that. Figures on spreadsheets are not enough to convince. Impact surveys could be a great way to put our patrons up front and testify about their library experience and how we contribute to their academic success. So adapting the “existing survey to provide similar services fine-tuned to the needs of academic and school libraries” would be just great. Thank you for keeping us informed!
    Philippe Lavigueur
