October 16, 2017

Placements & Salaries 2013: Explore All the Data

Details on jobs and pay for 2012 LIS grads, broken down by region, type of role, school, and more.

Dig through these tables to discover the details about where 2012 LIS grads are landing jobs, at what salaries, and in what kinds of roles, or see the full feature for all the analysis.

CLICK HERE to download a spreadsheet that contains all of the tables seen below. (You may need to right-click this link and save the file to your computer in order to view it.)

This article was published in Library Journal's October 15, 2013 issue.

About Stephanie L. Maatta

Stephanie Maatta, Ph.D. (es7746@wayne.edu), is an Assistant Professor at Wayne State University School of Library & Information Science, Detroit.



  1. Promote ideas, not reputations says:

    Regarding Table 3, it is startling to see the difference in placement numbers between the ** schools that provided their own data and the rest of the schools listed.

    Is there any point in including that data in the same table? Those institutions clearly used different methodologies for calculating their (evidently phenomenal) success with graduate placements.

    • In response to the comment made about schools that collect their own data, I thought it would be helpful to provide more detail in an effort to assure readers that the data collected for the University of Wisconsin-Madison School of Library & Information Studies (SLIS) matches that of LJ’s own survey.
      UW-Madison SLIS recreates the LJ survey, using the exact wording that LJ uses for each question, and we include all the questions that LJ asks in its survey. We then send the survey results to LJ in an Excel file, with each question and the reported numbers listed.

      We send out the survey directly vs. having LJ send the survey for two reasons:

      (1) We add 2-3 additional questions at the end of the survey, beyond what LJ asks, that we find helpful in our own continuing improvement of the program.

      (2) We get a much higher response rate if the survey is sent directly to students by staff from SLIS. Our response rate is consistently 70% or higher, while a typical survey response rate is closer to 30%. Receiving more responses means more accurate data for our program, and for our current and prospective students. Unfortunately, LJ doesn't print response rates for each of the schools listed, which would help readers determine how accurate the printed results are.

      Tanya Cobb, Student & Alumni Services Coordinator, UW-Madison SLIS

  2. I could be wrong, but these images seem like JPEGs. Is it possible to present this information so those who are using screen readers (whether for visual disability or other reasons) can access it? Or a link to where the tables might be presented in readable form?

    • Meredith Schwartz says:

      Hi Katie,

      We’ve added a link to download the spreadsheet. Hope this helps!


  3. Matt Marsteller says:

    Why aren’t all ALA accredited programs included? I can understand it being difficult if the institution won’t respond, but in some cases you’ve contacted the students and have provided some data.

    • Meredith Schwartz says:

      Hi Matt,

      LJ relies on the schools to contact the students. We invite all students to share data whether or not their school participates; some of them contact us, which explains the data we provided, but we don’t have a direct line to reach out to them independently.

  4. Are only FT positions included? Or are the salaries extrapolated from the hourly wage? Because there is a HUGE difference.

  5. How ironic that information on salary and placement of information professionals was compiled and presented so poorly.

  6. Hello,
    Are the raw data available from this? Not the summary spreadsheet as provided, but rather the detailed response data?

    Some of these figures present statistically dubious information, and there should be more analysis of central tendency.

    Also, the totals on the first 4 data columns of table 2 are wrong. Left to right they should be 1528, 776, 195, 977.


    • I’m getting wrong totals for the entire bottom row of Table 1. When I add the columns, I get 1862, 984, 169, 235, 1454, 152, and 257. There is a note at the bottom of the table explaining that the tables “do not always add up, individually or collectively, since both schools and individuals omitted data in some cases.” But shouldn’t the columns always add up? These totals are just counting the number of respondents in each category.

    • And for Table 6, I’m getting a total of 1506 for the first column. Again, shouldn’t that column total in a pretty straightforward way? What am I not getting about how these numbers are being interpreted and presented?

    • Maybe I’m not reading this right, but I’m getting different numbers on all but two of the total placements by type of organization. Here are my numbers that are different from the ones displayed in the table: All Public: 404; All School: 143; All Academic: 339; All Special: 74; and All Government: 34. For the last two, All Archives and All Vendors, I get the same total displayed on the table.

  7. Sorry, don’t mean to beat a dead horse, but one more strange total in Table 7, for “Other Organizations.” I have added the numbers in the first column up a couple of times, and I get 276 (vs. 334 shown in the table). That’s pretty far off. Is it just so late in the workday on a Monday that I’ve become completely incoherent?

  8. P. Granger says:

    Hi, is it possible for Ms. Maatta or someone from LJ to respond to these comments about the columns not adding up correctly and the disparity in the data between schools that responded to the survey and those that didn’t? And, as Chad asked, is it possible to obtain the raw data? I find it troubling that LJ puts this information that affects livelihoods and reputations out into the world and will not provide responses to some really fundamental questions.

    • Meredith Schwartz says:

      Ms. Maatta has been contacted and will respond as soon as possible; unfortunately she is dealing with a personal issue, so there may be some delay.

  9. Steve Neff says:

    Each October SLA conducts a survey of placements; this year the overall placement rate is 26% (6,184 graduates and 1,648 placements). It would be interesting to do a follow-up study on library graduates' placement a year later, at 18 months after graduation instead of 5 months. Even without additional surveys, these results are poor. If a prison rehabilitation program had a 26% placement rate, I would question the management of the program. It seems that no alarm bells are being raised by the American Library Association, the agency that accredits these programs.

  10. As an unemployed recent graduate (May 2013) I can say I haven’t received any survey from any source to complete as of yet. I know this is from 2012, but if they do this every year then I question their information gathering. I understand it’s difficult to get people to respond to surveys, but information this incomplete isn’t really of any help to anyone.

    There are so few responses that this data is more or less useless. From what I gather they had a response rate of about 29%, meaning that for 71% of the graduate population they have absolutely no clue whether those graduates are employed or not. But this gives the false impression that 74% of new 2012 graduates are unemployed. When I do the math and take only the number of respondents against the number employed, I come up with a much more reasonable unemployment rate of 3-4% (which would be in line with national averages for those with higher education).
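The commenter's point can be made explicit with a small calculation. A minimal sketch, using assumed illustrative counts (the cohort size of 6,000 and the 3.5% unemployment among respondents are placeholders, not LJ's data; only the 29% response rate comes from the comment), contrasting the two ways of computing an unemployment rate:

```python
# Sketch of the two calculations the commenter contrasts.
# All counts here are illustrative assumptions, not figures from LJ's survey.
graduates = 6000            # assumed cohort size
response_rate = 0.29        # response rate cited in the comment
respondents = round(graduates * response_rate)        # 1740
unemployed_respondents = round(respondents * 0.035)   # assume 3.5% of respondents unemployed

# Misleading rate: treat every non-respondent as unemployed.
misleading = (graduates - respondents + unemployed_respondents) / graduates

# Respondents-only rate: the commenter's preferred figure.
among_respondents = unemployed_respondents / respondents

print(f"{misleading:.0%}")         # 72%
print(f"{among_respondents:.1%}")  # 3.5%
```

The gap between the two figures is driven almost entirely by the response rate, which is why the commenter argues the survey's summary statistics are misleading without it.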