October 7, 2015

Find a Library

Whether or not your library has been given a star rating, you can benefit from finding peers in your expenditure category and comparing stats. For the scores for all 7,513 libraries included in this round of the LJ Index, download this document to see the libraries rated, their ratings, and the data from which the ratings were derived:

All Libraries Rated in the LJ Index 2011

If you cannot find your library, please review the criteria for inclusion in the LJ Index, or check the document below, which lists excluded libraries and the reasons for their exclusion:

Libraries excluded from the LJ Index 2011

If your library subscribes to Bibliostat Connect, you can search all of the included library data there.


About Ray Lyons & Keith Curry Lance

Ray Lyons (raylyons@gmail.com) is an independent consultant and statistical programmer in Cleveland. His articles on library statistics and assessment have also appeared in Public Library Quarterly, Public Libraries, and Evidence Based Library and Information Practice. He blogs on library statistics and assessment at libperformance.com.
Keith Curry Lance (keithlance@comcast.net) is an independent consultant based in suburban Denver. He also consults with the Colorado-based RSL Research Group. In both capacities, he conducts research on libraries of all types for state library agencies, state library associations, and other library-related organizations. For more information, visit http://www.KeithCurryLance.com.



  1. Jeff Eide says:

    I appreciate the effort you’ve put into this. However, I have some serious concerns.

    1) Basing the index entirely on per-capita measures is concerning to me. When you do this, you are really depending on a solid, consistent definition of the legal service area — which, in my experience, is a big assumption. I would suggest that a lot of the libraries that score well do so not necessarily because they are providing such excellent service to their legally defined population, but because they are actually providing service to a larger area & population (due to cross-over agreements, etc). In effect, their legally defined population (and, therefore, their per capita figures) means little. When libraries are claiming per capita visits of 95.8 or 87.9 or 63.9, it really ought to raise some red flags. If these visits are coming from outside the service area, what do the “per capita” figures mean at that point?

    2) It is not clear to me why you have divided the index (and, therefore, the stars) into Expenditure Range categories. One might think that was because it would be unfair for libraries with vastly different amounts of money to have to “compete” against each other. However, if you look at the scores, you see a pretty consistent pattern that the smaller (ie. less well-funded) libraries tend to have higher raw scores. That actually seems backwards to me. In an index that is measuring quality, wouldn’t you expect the better funded libraries to be doing better?

    To take a library in my community, Hennepin County, MN (not my employer, by the way) — they have a very good national reputation, but wouldn’t even make the list in any other population category. Sure, they made the top 15, but that’s not actually all that impressive in a category that only has 48 libraries. With all due respect to the Ida Long Goodman Memorial Library in KS, am I really expected to believe they’re providing better service than Multnomah, Hennepin County, Seattle and, well, every single other library of the 366 with expenditures $5M+? Isn’t that what the higher score would imply?

    I am glad someone is working on a project like this. But there are some patterns that are concerning. I would love to see my population service area concerns addressed, because trusting those figures is crucial to the whole project.

    Thank you for your time.

    Jeff Eide
    Ramsey County Public Library

    • Keith Curry Lance & Ray Lyons says:

      Jeff, thank you for your thoughtful comments on the LJ Index. The issues you raise about per-capita measures are quite legitimate. Alas, there is little that can be done about them, except to acknowledge them. Per-capita statistics are, indeed, skewed very seriously by situations in which a library’s legal service area population and its actual clientele do not match up. The inclination is to think that these two populations should match up; however, there are many circumstances in which that will never be the case. As you note, extraordinarily high per-capita statistics are definite “red flags” that this is going on. About the only alternative would be to calculate per-borrower statistics, using the figure a library reports for total registered borrowers. Unfortunately, that is an even more “infamously” unreliable figure, which can vary wildly depending on how and how recently the borrower file has been “cleaned,” and–even more problematically in circumstances like this–it is a figure that is extremely easy to manipulate to “game” an index score. In short, it’s not really an option at all. The bottom line on per-capita statistics is that they are far from perfect for just these reasons; but, they are the best thing we have to work with. If you or others have ideas about other alternatives, we would be delighted to hear more from you.
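      The skew described above can be made concrete with a small sketch. The figures below are entirely hypothetical and only illustrate the mechanism: when a library's actual clientele is larger than its legal service area population, dividing outputs by the legal population inflates the "per capita" statistic.

```python
# Hypothetical figures illustrating the per-capita skew discussed above:
# a library whose actual user base exceeds its legal service area population
# (e.g., because of cross-over borrowing agreements).
legal_population = 10_000    # legal service area population (official denominator)
actual_clientele = 25_000    # hypothetical real user base, including outsiders
annual_visits = 400_000

# The official per-capita figure divides by the legal population only.
per_capita_official = annual_visits / legal_population   # 40.0 visits "per capita"

# Dividing by the actual clientele gives a much lower, more realistic rate.
per_capita_actual = annual_visits / actual_clientele     # 16.0 visits per user

print(per_capita_official, per_capita_actual)
```

The same visit count yields a figure two and a half times higher simply because the denominator undercounts the people actually being served, which is why unusually high per-capita values serve as red flags.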

      The other issue you mentioned was that less well-funded libraries often have higher per-capita statistics than their better-funded counterparts. This, too, is often, though not always, true. Most likely, this seemingly upside-down circumstance is explained by the multitude of demographic, social, and economic differences between larger and smaller communities. In any event, whatever the causes, it is obvious that it would be unfair to compare libraries with wildly different levels of funding. The only alternative to expenditure-based groups we considered was population-based grouping. As you also suggested, this is even more problematic for the reasons discussed above. In addition, there are many known situations in which libraries with similar legal service area populations have wildly different levels of funding. There is no right or wrong approach here. We simply felt that expenditure-based groups made more sense and presented fewer such problems.
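      The peer-grouping idea above can be sketched in a few lines. The library names, spending figures, and the single $5M cut point below are all hypothetical (the actual LJ Index uses several expenditure ranges); the point is only that each library is ranked against peers in its own spending band rather than against the whole field.

```python
from collections import defaultdict

# Hypothetical data: (name, total expenditures, visits per capita).
# Note the pattern discussed above: the smaller libraries show higher
# per-capita figures than the better-funded ones.
libraries = [
    ("A", 120_000, 9.1),
    ("B", 150_000, 7.4),
    ("C", 6_500_000, 4.8),
    ("D", 8_000_000, 5.6),
]

def band(expenditures):
    # Illustrative single cut point only; the real index defines more bands.
    return "under $5M" if expenditures < 5_000_000 else "$5M+"

# Group libraries into expenditure bands, then rank within each band.
groups = defaultdict(list)
for name, spend, vpc in libraries:
    groups[band(spend)].append((vpc, name))

for g, members in groups.items():
    ranked = [n for _, n in sorted(members, reverse=True)]
    print(g, ranked)
```

Under this grouping, library A's 9.1 visits per capita never competes against library D's 5.6, so a small rural library and a large urban system each earn their standing among comparably funded peers.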

      Thanks again for your questions and comments. A project like the LJ Index with its Star Library ratings is always challenging, as there are many decisions to make about how to do it, some of which come down to arbitrary judgment calls. What is not arbitrary about the LJ Index is the statistics used to calculate it. They are major library service output measures that have historically stable relationships with one another. We did not choose them arbitrarily. Our fondest wish for this project is that IMLS and the state library agencies will add new output measures (e.g., wifi use, library website visits) to capture new types of library use that probably ought to be included in such an index but for which no data are currently available. Please be part of the discussion about needed new measures with your colleagues and your state library agency.