April 19, 2018

Digital Inclusion Survey: Renovation Matters, Help Happens at Point of Need, and Staff Still Do (Almost) Everything

In October, the Information Policy & Access Center (iPAC) at the University of Maryland and the American Library Association (ALA) released the results and initial analysis from the 2014 Digital Inclusion Survey. iPAC has gathered statistics on public libraries and the Internet for 20 years, and this report highlights the sea change over that time. Of particular note, this survey looked more closely at the relationship between recent renovations or construction (within the past five years) and the ability of libraries to support a full and robust online life for all of a community’s residents, regardless of age, education, and socioeconomic status. That support includes free access to public access technologies (hardware, software, high-speed Internet connectivity); a range of digital content; digital literacy services that assist individuals in navigating, understanding, evaluating, and creating content using a range of information and communications technologies; and programs and services around key community need areas such as health and wellness, education, employment and workforce development, and civic engagement.

The report presents clear statistics on what libraries are doing already, and how many of those offerings are presented by library staff themselves, including technology training and formal programs extending existing educational systems. It also presents casemaking materials for libraries looking to expand their programs and services around digital inclusion, including:

  • Data to support a library renovation or infrastructure improvement proposal
  • Reasons to support staff training initiatives and/or data to encourage new community partnerships
  • Talking points to use with community leaders to encourage additional support for ways to make the community more attractive to younger/more affluent populations
  • Benchmarks to compare each organization to its own state and national trends, including ideas for surveys to help define a focus

In addition to the full report, iPAC/ALA have published topic-specific briefs that make points quickly and are easy to hand to trustees and community leaders along with a request.

The briefs and Extended Summary rely heavily on the intersection between the Digital Inclusion Survey results and the Libraries at the Crossroads user report from Pew, published this September. The Pew survey asked many of the questions that appear on the Digital Inclusion Survey, but from a user’s perspective. These two reports work together to show that while library circulation and program attendance are slightly declining, use of remote/online services and the use of library buildings as places to hang out, get work done, and get questions answered are increasing. The lower overall formal program numbers and higher point-of-use percentages seem to bear this out.

Renovation => Innovation

One of the clearest results of the survey is that while libraries can keep doing what they’re already doing in the buildings they have, upgrades radically improve their ability to expand their services and programs. In addition to being able to upgrade or accommodate more public computers, infrastructure improvements that come with renovations offer library users more outlets for mobile devices and laptops, stronger Wi-Fi to handle higher bandwidth usage (including streaming media), and space that shifts from materials storage to flexible use. The Crossroads study showed an increase in users who go to the library just to “hang out”; if you walk around your library, you’ll see that many of those users are “hanging out” on their mobile devices and watching media on them.

Of those surveyed, the smallest differences between renovated and unrenovated libraries were in the programs/services most tied to physical collections and ones that libraries have been doing the longest: summer reading, homework help, reference, etc. In contrast, there was a 10–15 percent improvement for renovated libraries in providing access to e-government services and health-related information, and providing workspace for mobile workers.

On a related note, 87.6 percent of libraries rated their buildings as “poor or fair for Maker spaces.” Although not every library wants or needs a Maker space per se, the same infrastructure improvements that would make a building suitable for a Maker space would improve its ability to support the activities listed above.

Staff Training and Local Partnerships

A second highlight of the survey has much less visibility in the briefs and summaries: library staff continue to offer the lion’s share of help and education to the public at the moment of need—during reference interactions, circulation transactions, and informational conversations. With the shift to online interfaces with government, health agencies, employment resources, and education programs, informal point-of-use assistance becomes a training opportunity for basic computer use, Internet skills, online safety, and digital content creation and sharing.

In the questions about tech training, this distinction is striking. For basic or Internet-based skills, an average of 75 percent of this work is done informally at point of use, compared with 40–50 percent in formal classes, with a spike of 87.5 percent at point of use vs. 11.7 percent in formal classes for assistive technologies. For example, library staff used to just hand folks IRS forms and they’d take them home to fill out. As the IRS has shifted to online tax filing, library staff now help users navigate the IRS site, download the forms or fill them out online, answer the questions that invariably come up during the process, and help them print and/or email their returns for their records. What started as a simple directional transaction has become a much more complex informational transaction.

In contrast, advanced topics like web development and digital content creation are more evenly spread between informal and formal help: 51.2 percent vs. 44 percent for web development and 57.6 percent vs. 53.3 percent for digital content.

Overall, library staff offered more than 90 percent of these programs, compared with 7–15 percent by volunteers or partner organizations. This changed for advanced topics: 79.9 percent library staff vs. 21.9 percent partner organizations for web development and 83.5 percent library staff vs. 26.9 percent partners for digital content creation.

Depending on each library’s needs, this data can support two very different kinds of casemaking for libraries and librarians: improved training for staff, and increased outreach to partnering organizations.

If 75 percent of the assistance libraries offer is at the point of use, staff can’t “prepare” for that the same way that they prep for a program or class. Public-facing staff need to have consistent, periodic training that will help them answer these questions in the moment, supported by adequate funding to bring qualified trainers in or send staff to meet them (whether virtually or in person) and at-work staff time to attend training.

Since library staff are already offering more than 90 percent of both formal and informal tech training themselves, if libraries want to increase the amount or type of tech-focused programming they offer without adding staff, it has to come from partners in the community. This data can support appeals to trustees and other local officials, asking them to spread the word that libraries want to do more, but need more help from other organizations in the community to do so.

“Hidden” Data

Subtle differences in the way that both the Libraries at the Crossroads and Digital Inclusion Survey were written and analyzed seem to have contributed to useful data being unintentionally “hidden.” For example, in the Crossroads survey there’s a question about how often users go to the library to ask for help on e-government and other online institutional resources and forms. The Crossroads report, which focuses on formal programs, indicates that people don’t go to the library very often for this kind of help, but the Digital Inclusion report says that this kind of help happens most often at point of need/use. What this hides is that it’s the people who are using the library to access these resources who seek that help, not people attending programs.

Again taking tax forms as an example, someone doing their taxes at home usually won’t go to the library to ask for help. They’ll use the instructions, phone assistance, or online resources to find answers. In contrast, someone sitting at a library computer to submit their forms will absolutely ask library staff for help. Neither person is likely to attend a class on submitting taxes online, but people already in the building will ask for help at the point of need while the person at home might ask for help on the library’s chat reference or look up tax tips on the library website. In neither case is this a “program,” which means that these services can be hidden inside general reference and website statistics.

Using a combination of these two reports, library directors can reveal “hidden” usage trends to community leaders, supporting requests for renovations, staff training, and local partnerships.

New Tools

The briefs and reports generated by iPAC can be handed directly to relevant stakeholders to support requests for anything from renovations to improved staff training. The executive summary is a convenient two-pager to include as an addendum to a proposal or hand out at a meeting, while the topic briefs are four-pagers that focus on more specific areas. The full report can feel overwhelming in terms of data, but it clearly shows detailed comparisons on particular questions and across different library types: city, suburban, town, and rural. The survey questions themselves (in the full report) can be used to create surveys that will compare an individual library more closely to the national numbers, or as inspirations for more specific surveys about patrons’ needs.

A second tool is the interactive data tool available at digitalinclusion.umd.edu/content/interactive-map. Users can start with a search for either a city/town or a specific library and see its survey results (if any), then use the tabs and overlays to look at statistics for local economics, demographics, educational levels, and more. A screenshot of the map can add a visual element to a request or report, especially for libraries that participated in the survey.


What Comes Next?

One major unanswered question is: Why aren’t libraries offering more training as formal programs? Are they limited by staff skills, space, technology, time, or programming funds? Or is the problem a lack of community demand? If the latter, libraries might consider diverting those resources to offering more point-of-use help. Or does the infrastructure need upgrading, so that once the library is renovated or the technology updated, it will be able to support more programming?

For a deeper dive into the data and what it means for libraries, Larra Clark, Deputy Director of ALA’s Office of Information Technology Policy, and John Carlo Bertot, Digital Inclusion Survey lead researcher and co-director of the Information Policy & Access Center at the University of Maryland, will explore the results of the Digital Inclusion Survey further in a series of posts on ALA’s District Dispatch blog.



  1. Jennifer: Thanks for the detailed and thoughtful analysis of our study! A quick update that may be of interest: the interactive mapping tool (national and state level maps on the state pages) has now been updated with new features – one in particular allows libraries to either update selected existing data if they participated in the survey or add selected data if they did not. Updated/added data will show up in the map and be tagged as such.

    • That’s great to hear, John – I was hoping that there’d be a way for libraries who didn’t know about the survey to add their data afterwards. It’s a good tool and I think that features like that will make it more robust as the word gets out about it. Thank you for all your work on this project!

  2. Remember how the five-year meme at the beginning of the millennium was that libraries as a “place” were dying? Yeah. Which is why I no longer pay attention to futurists, consultants, or “change agents”.