September 16, 2014

Taylorism Comes to Campus | From the Bell Tower

Just because technology allows us to do something, should we? That’s a big question being asked in higher education when it comes to student performance tracking and predictive analytics.

If you could create a new service called “The Library That Learns You,” would you do it? Imagine combining several emerging technologies, such as artificial intelligence agents, the Internet of Things, and wearable computers to build the capacity for a highly personalized library experience—but also an analytical system for identifying at-risk students who, at the point of need, could receive highly customized support from an academic librarian. Think of it as a more highly evolved embedded/personal librarian service. The library’s presence goes beyond just a link in a course, an announcement from a librarian, or an annual greeting welcoming a student to campus. The academic library lives in the student’s data core, intermingling with his or her devices, connected to his or her academic records, and having the ability to predict what the library can deliver next.

Here’s an interview with a student from the future commenting on their reaction to “The Library That Learns You.”

Positive, Negative, or Somewhere In Between

What’s your immediate reaction? Those having a positive reaction may appreciate the value in using technology to help students succeed. That might be particularly important to academic librarians at community colleges or universities with large segments of low-income and first-generation students where retention rates are low. To help more of those students persist to graduation, we might be willing to use any technology resources at our disposal—especially as the latest data indicates that the number of students who fail to persist after the first year is on the rise. Those who had a negative reaction, or felt some level of discomfort, no doubt worry about how such technologies invade and compromise personal privacy. There is also likely some middle ground where we can see the possibility for both good and bad outcomes. We want to give our students every advantage, but at what cost—and are we willing to pay the price?

Already There?

However you reacted, the reality is that high-tech performance monitoring systems will become commonplace in the not too distant future. From the cradle to the grave, parents, educators, and employers will leverage monitoring and assessment technology to help us perform at our best, guide us to make the right decisions, and protect us from the unseen and unknown dangers that could derail us from the tracks of security and success. The growing popularity of wearable fitness devices points to a willingness to subject ourselves to performance monitoring if we believe it will produce a desired outcome. When the data from a day of activity is uploaded to the cloud, do we know exactly where it’s going, who can access the data, or the degree to which its confidentiality is protected?

Surveillance technology is also being applied in the workplace, mostly in the service and retail industries, to monitor and improve employee performance as well as allow for the detection of theft or unethical behavior. A study of 392 restaurants that installed these systems reported that loss from theft had moderately declined, but, more significantly, that sales per server increased dramatically. Knowing they were being monitored, servers worked more diligently to sell extra appetizers, drinks, and desserts. Owners received more profits, servers received bigger tips, and customers received better service. Who else was a winner? The system vendors, who collected vast amounts of data to use in their restaurant consulting services. As the products and services we use all become more “intelligent,” data will be collected about and from us in ways we can hardly imagine today. By 2025, every automobile manufacturer will produce “connected cars” that collect data about our driving habits, destinations, and system performance. Would you be opposed to getting a text message from your car pestering you to stop procrastinating in getting the oil changed? Helpful, maybe, but at what cost to your privacy?

Asking the Right Questions

Why, now, do humans need to be monitored to help them become better students or workers? What’s wrong with allowing students to experience college without a safety net there to save them whenever they lose their balance? John Warner smartly tackles these questions in his essay “The Costs of Big Data.” Reacting to a piece by Anya Kamenetz about the Course Signals system at Purdue, an analytical performance-tracking warning system, Warner asks what it is we really want for our students when it comes to success. Just graduating? He writes, “What if we worry that their adult lives will not come with Course Signal warnings? And mostly, what if we worry that this institutional focus on capturing and employing data distracts us from what is most meaningful about the college experience…maybe tells students that they are a data point. Or maybe Course Signals becomes a crutch, substituting tips and tricks for in-depth human interaction, the kind we know alters lives.”

We need critics to question the value and necessity of performance tracking and analytics systems, not only because of data mishandling and privacy intrusions but because of the unknown consequences they may have for our students. What if these systems help students survive college but leave them unprepared for the world of work, where no one is helping them avoid failure? Or will theirs be a world where constant monitoring, data collection, and analysis is simply a way of life?
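To make concrete what a Course Signals-style warning system actually does, here is a minimal sketch in Python. Purdue’s actual model is proprietary; the inputs, weights, and thresholds below are illustrative assumptions, not the real algorithm. (Course Signals did present results to students as red/yellow/green “traffic light” signals, which the sketch imitates.)

```python
# Hypothetical sketch of a Course Signals-style early-warning score.
# The real system's model is proprietary; the inputs, weights, and
# thresholds here are illustrative assumptions only.

def risk_signal(current_grade, lms_logins_per_week, prior_gpa,
                weights=(0.5, 0.2, 0.3)):
    """Return a traffic-light signal: 'green', 'yellow', or 'red'.

    Each input is normalized to 0..1 (1 = strongest evidence of
    success), combined as a weighted average, then bucketed.
    """
    grade_score = min(max(current_grade / 100.0, 0.0), 1.0)
    login_score = min(lms_logins_per_week / 5.0, 1.0)  # 5+ logins/week = full marks
    gpa_score = min(max(prior_gpa / 4.0, 0.0), 1.0)

    composite = (weights[0] * grade_score
                 + weights[1] * login_score
                 + weights[2] * gpa_score)

    if composite >= 0.7:
        return "green"   # on track
    elif composite >= 0.5:
        return "yellow"  # watch list: a nudge from instructor or librarian
    else:
        return "red"     # at risk: trigger outreach

# A struggling, disengaged student trips the red flag:
print(risk_signal(current_grade=55, lms_logins_per_week=1, prior_gpa=2.2))  # red
```

The sketch also makes the critics’ point visible: everything hinges on which data streams are fed in, who chose the weights, and what intervention a “red” actually triggers.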

Calming the Fears

Knowing that there are ways in which collecting and using student data could prove beneficial, perhaps we should refrain from immediately writing off monitoring and preemptive warning systems as dangerous technologies. To that end, how student data is or will be used is a growing area of debate at every level of the American education system. Still, repeated large-scale incidents of data mishandling and privacy intrusion should rightfully have us questioning whether feeding student data into these systems is a practice worth considering at all.

In his article “Reframing the Data Debate,” Steve Rappaport acknowledges this when he states, “Fears about misuses of student data feed into larger narratives about dangers to privacy and the security of data fueled by revelations about the NSA, Target, etc., and their fervor makes it impossible to dismiss them as ill-informed rants.” He goes on to remind us that progress in education at all levels has always depended on the collection of student data. Though he represents the interests of educational technology firms that produce the learning products consumed by students, Rappaport writes that those firms must clean up their acts and demonstrate that they can calm the fear by making sure student data is secure and that privacy rights are respected. That sounds good, but can we trust the EdTech industry to do the right thing?

Setting Limits and Sensible Choices

Perhaps the digital tools for tracking, monitoring, and performance assessment, all intended to enable predictive analytics, are neither good nor bad in themselves. They are tools at our disposal that can accomplish something helpful but could also have unintended consequences leading to harmful results. It’s up to us to determine the level at which we implement and apply them and to understand fully the context for their use. While I think a “Library That Learns You” service is interesting and builds on the growing trend toward personalized services in academic librarianship, I’m personally uncertain about it. Some students would find it valuable, and it could shift the odds of success in the student’s favor, but it hardly seems like our preferred mode of interaction. Just because you could put a robot at your reference desk, would you do it? It may sound awful now, but in 25 to 30 years, when it’s technologically possible, perhaps it will be just one more user expectation, not unlike expecting to find a café in today’s library.

Learning From the Past

I don’t have the answers. What I do believe is that, over the next 20 or 30 years, our profession will be greatly challenged by this environment of student data. Some of the pressure to participate in these systems will come from our own academic administrations as they seek to improve student performance, lower student debt, and meet the metrics required by emerging government standards. Taylorism was once a respected method for improving the workplace and its outcomes. Looking back, we now know that imposing scientific management achieved great efficiencies, but did so at the cost of destroying worker morale. We will need to be careful not to repeat that mistake: deploying technology with the good intention of helping our students achieve short-term results when it is not clear how, in the long run, it will truly affect them.

This article was featured in Library Journal's Academic Newswire enewsletter.

About Steven Bell

Steven Bell, Associate University Librarian, Temple University, Philadelphia, PA, is the current vice president/president-elect of ACRL. For more from Steven, visit his blogs, Kept-Up Academic Librarian, ACRLog, and Designing Better Libraries, or visit his website.

Comments

  1. Yes! The Library That Learns You is very close to “My Ideal Library App” that I’ve been hoping for: highly personalized, productive, and social. (see: http://ganski.wordpress.com/2012/11/29/my-ideal-library-app/)

    Is Temple really working on this?
    Kate

  2. steven bell says:

    Wouldn’t we like to find that killer app for academic libraries?

    The library that learns you is a personalization concept. It’s not something we are currently working on, and I would say that technologies that would be needed to make it possible are happening but are not yet where they need to be. So something like this could be a few years off.

    There are things we can do now to develop more personalized services, but I don’t think we can put it in the form of an app just yet.

    It sounds like you, Kate, would be fine with some of the data issues and concerns I discuss in the column.

    Your idea about an app makes this type of personalized service – one that depends on gathering data about students – an opt-in service. Students can choose to use it if they think it will help them, or pass on it if they have concerns about how their data might be used.

    Thanks for your comment and perhaps we’ll see something like the app you describe in the not too distant future.

  3. Barbara Fister says:

    I’m okay with a librarian who knows me. I don’t want a library to do so, because that means data is collected by third parties and algorithms rather than human beings are making recommendations, and I think humans are much better at it. I am also concerned about some of the underlying assumptions or goals of proponents of big data analysis.

    I really liked the way Kelly Jensen at Book Riot contrasted algorithms with people, and reaching out with reaching in, in her post “Libraries are Not a ‘Netflix for Books.’” http://bookriot.com/2014/07/15/libraries-netflix-books/

    Thanks for being thoughtful about the issues, Steven.

    • stevenb says:

      Thank you for sharing your concerns Barbara, and I do share them – having written about the promises and pitfalls of big data (http://lj.libraryjournal.com/2013/03/opinion/steven-bell/promise-and-problems-of-big-data-from-the-bell-tower/). I also agree that it would be much better to establish personal connections with students and faculty. The challenge is that with 20,000 undergrads, many of them first-gen or low-income who are potentially at-risk, these algorithms – while they have issues – may help some of these students get past the struggles they may face. I like what Roger Martin has to say about algorithms – that once we create them, if they work well, we stop thinking about the dangers they present.

  4. Not one pixel in this article addressed the impact of such a library surveillance suite on the Library Bill of Rights, or the potential impacts on intellectual freedom. I just can’t take any conversation that avoids the topic seriously.

    We protect intellectual freedom. The argument against library surveillance (again, your phrasing, Steven) is that the use of such tools endangers intellectual freedom and abrogates our responsibilities as libraries and librarians. This is not an issue of personal wants, wishes, and possibilities; it’s an actual professional obligation being dismissed.

    If that issue can be resolved, then a conversation can happen. Until then, it’s an irresponsible professional conversation that fails to take up the essential issues.

    • stevenb says:

      Thank you for your comment. I don’t use the phrasing “library surveillance” in this column.

      You point out a matter of concern. One thought is that the technology may advance and allow us to ensure that no student’s IF rights are violated. Students might also choose to opt into such services, knowing full well the possible consequences, because they believe it will give them an academic advantage.

  5. Lisa Hinchliffe says:

    A robot at the reference desk isn’t a 25-30 year out possibility. Stella (a bot) is online (since 2004, I believe) … though you have to talk to her in German. I used Google Translate to change “when is the library open” into German and was helpfully provided with this: http://www.sub.uni-hamburg.de/service/wann-wo/oeffnungszeiten.html?aufruf I understand there are other library bots around the world…

    • stevenb says:

      Thanks for bringing this to the attention of the readers, Lisa. While I agree that my timeline is probably too lengthy, the type of reference robot I had in mind is still a few years away. I’m thinking more of an actual robot that sits behind a desk, or roams the library, or visits your class. This German application shows that the artificial intelligence behind advanced robotics is here today.
