March 21, 2018

Information Literacy as an Unnatural State | Peer to Peer Review

Librarians tend to view information literacy in light of the ACRL Information Literacy Competency Standards. Information literacy is a set of competencies, a set of things we should be able to do. If you’re information literate, you should be able to, among other things:

  • Determine the extent of information needed
  • Access the needed information effectively and efficiently
  • Evaluate information and its sources critically
  • Incorporate selected information into one’s knowledge base

These all sound like good, sensible outcomes for higher education. However, one of the many problems with becoming information literate in any robust sense is that it’s completely unnatural. The entire enterprise goes against the way the human mind tends to gather and use information. Human beings are animals perhaps capable of information literacy but apparently designed to work in other ways.

You probably didn’t balk at that last sentence at all. Human beings are designed to work in other ways. You might think human beings are “designed.” But did you come to that belief through a neutral and critical evaluation of the available evidence or through some other route? If you approached and evaluated the evidence for such design critically, you might be much less sure of that belief and conclude, along with the scientific consensus, that humans are the product of evolution and not design.

However, regardless of whether it is “designed” or not, “the human mind is highly prone to detecting agency,” according to an article in the recent book The Philosophy of Pseudoscience, “and it often does so even in the absence of agents.” We might think a person is nearby even if it’s only the wind. From an evolutionary perspective, in our history “it is far less costly to assume that there is an agent when there is none than the other way around.” After all, if it’s the wind and we’re temporarily wary that it’s a person behind us, that’s better than if we ignore the sound and get attacked. We think we’re designed because we want to attribute everything to a designing agent. It’s just natural.

Even if you don’t think we’re designed, you’re probably comfortable with that kind of language, and language shapes our thought. In this case, the language is in line with what Robert McCauley calls “maturational naturalness,” the sort of thinking that comes naturally just because we grow up. We’re comfortable with attributing agency even where it doesn’t exist, because that’s just the way we work. Some skills, like information literacy, can be the result of practiced naturalness, if in fact people practice them.

The literature on pseudoscience is rife with examples of flawed reasoning, but the flawed reasoning is a result of our natural thought patterns. Scientific reasoning and critical thinking, the motors behind information literacy and the academic enterprise, are learned and rare, which is why most of us reason poorly much of the time and all of us reason poorly some of the time.

Aristotle wrote that humans by nature desire to know, but that doesn’t seem to be true. Psychologists studying how the mind functions might say instead that humans by nature desire to interpret the world in a way that makes sense to them and makes them feel good about themselves, regardless of the facts. It’s called motivated thinking, and one review of the literature claims that “individuals’ preferences for certain outcomes are believed to often shape their thinking so as to all but guarantee that they find a way to believe, decide, and justify whatever they like.” Consider that in relation to the task of teaching information literacy.

Such thinking is especially prevalent around issues of great importance to us, such as politics or religion. In every area of belief, we want to be right, or at least considered right, but when it comes to beliefs core to our definitions of ourselves, we are highly resistant to alternative beliefs. People are more likely to evaluate positively information that confirms their beliefs and spend more time criticizing information that challenges them. Not only that, but some studies show that when confronted with strong evidence that their beliefs are mistaken, people tend to hold those beliefs even more strongly. It doesn’t matter how rigorous or scientific the information is. What matters most is their previous beliefs and how the new information makes them feel about themselves.

There are various names for these flaws in critical thinking. A study on “motivated skepticism in the evaluation of political beliefs” focused on the following: motivated skepticism, confirmation bias, disconfirmation bias, prior attitude effect, selective exposure, attitude polarization, and cognitive dissonance. We naturally do everything we can to avoid changing our minds and everything we can to make ourselves look better.

However, these behaviors don’t just apply to politics or religion. A recent study from Finland on “core knowledge confusions among university students” found that even university-educated students had trouble “in fully differentiating the core ontology of physical, biological, and mental phenomena.” For example, “children may construe almost anything as animate,” as if the moon were a living being because it “moves” across the sky. Adults aren’t necessarily that much better, even educated ones. Students were given 30 statements such as “plants know the seasons” or “furniture wants a home.” “Half of the participants considered at least four, and one quarter of the participants considered eight to 30 statements to be literally true.” That’s literally literally, not figuratively literally, as in such common statements as “that movie literally scared me to death.”

In addition, consider Daniel Kahneman’s and others’ work on fast and slow thinking, which shows that quick, intuitive thinking comes very naturally to us and is often inferior for anything that requires more complex thought. The slower, more deliberate thinking is difficult and indeed physically draining.

The natural working of the human mind explains why so many people believe in astrology, crystal healing, or homeopathy, despite the lack of evidence that they work. Even trained scientists and academics aren’t immune to these problems and are often guilty of confirmation bias or the use of selective evidence.

Information literacy in a strong sense is deeply unnatural, and yet we task ourselves with teaching it. Sometimes we might feel bad for not accomplishing more, but given the workings of the human mind, when it comes to teaching information literacy, it’s amazing if we accomplish anything at all.

About Wayne Bivens-Tatum

Wayne Bivens-Tatum is the Philosophy and Religion Librarian at Princeton University and an adjunct instructor at the University of Illinois Graduate School of Library and Information Science. He blogs at Academic Librarian.



  1. For the curious, some of the sources referenced in the column:

    “Motivated Thinking.” In The Cambridge Handbook of Thinking and Reasoning, edited by Keith J. Holyoak and Robert G. Morrison, 295–317. New York: Cambridge University Press, 2005.

    Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

    Lindeman, Marjaana, et al. “Core Knowledge Confusions Among University Students.” Science & Education 20, no. 5–6 (May 1, 2011): 439–451.

    McCauley, Robert N. Why Religion Is Natural and Science Is Not. New York: Oxford University Press, 2011.

    Pigliucci, Massimo, and Maarten Boudry, eds. Philosophy of Pseudoscience: Reconsidering the Demarcation Problem. Chicago: The University of Chicago Press, 2013.

    Taber, Charles S., and Milton Lodge. “Motivated Skepticism in the Evaluation of Political Beliefs.” American Journal of Political Science 50, no. 3 (July 2006): 755–769.

  2. As an instruction librarian, I’m oddly comforted by all this. I’ve had fairly good luck getting my students to understand and apply what I’m teaching them, but it is ABSOLUTELY an unnatural skill, and doesn’t come easily to most people, especially 18-year-old college freshmen who have very little incentive to care deeply about web evaluation or the origin of scholarly sources.

  3. Thank you for being truthful about this IL nonsense. I read Stanley Wilder’s 2005 article “Information Literacy Makes All the Wrong Assumptions” this week, and I cannot believe librarians are spinning their wheels with one-shot instruction and even throwing in “assessment” of these so-called instruction sessions. Even the term “information literacy” is offensive. As an academic librarian for seven years, I am ashamed that the profession is so desperately hanging on to information literacy as a form of survival, when in fact the entire premise of information literacy is destroying our roles in academia.

    • Rebecca, I’m not sure I consider it nonsense as such, but I very much disagree with the way IL, whatever it might be, has been pushed by librarians as something where librarians play an especially significant role, when if there is such a thing as IL (or what I’m tentatively calling “scholarly habitude”), it’s more like the goal of a liberal education than anything particularly library-related.

    • I respectfully disagree that IL is nonsense. I do agree that info lit (or whatever label you want to put onto the amorphous, unstructured set of critical thinking skills it attempts to frame) is not the sole territory of librarians, and I would suggest that most instruction librarians know this, even if we aren’t so eager to admit it publicly. Still, this doesn’t mean we have no part in helping students become reflective, critical thinkers and, by extension, succeed academically (which I would view as major parts of higher ed). In my view, IL is an attempt to fill in some very real gaps in the learning processes of many of our students in this regard. While content experts (faculty) and students’ interaction with content and learning activity play the major role in this, we still have a part to play. How students can effectively interact with and integrate information into their thinking practices is not something most faculty or course content, right or wrong, devote class time to. Have some views of IL overreached and, in effect, placed the librarian at the center of its processes? Yes. Are some of these skills unnatural? Perhaps. But then again, formal education as a whole might be viewed in that context as well. I’d much rather have us contribute to the attempt to make students aware of some of these unnatural ways of thinking and interacting with the world around them, than continue leaving those gaps in our current system ignored altogether.

    • Dana, you get no argument from me.

  4. Wayne,

    “Aristotle wrote that humans by nature desire to know, but that doesn’t seem to be true. Psychologists studying how the mind functions might say instead that humans by nature desire to interpret the world in a way that makes sense to them and makes them feel good about themselves, regardless of the facts. It’s called motivated thinking, and one review of the literature claims that “individuals’ preferences for certain outcomes are believed to often shape their thinking so as to all but guarantee that they find a way to believe, decide, and justify whatever they like.””

    A very interesting post here. Jumping off that paragraph above, I am wondering how that might fit in with the current scientific consensus re: evolution that you mention. Do you know if there has been much theorizing about why persons seem wired to believe, decide, and justify whatever they like when it seems like this should have been selected against? Doesn’t it make sense to think that knowing about reality might be more amenable to survival in our environments?

    Off the topic a bit, I know, but it’s hard not to leap off onto these bunny trails when your article synthesizes so many big ideas.


    • Nathan, I’m not aware of any current stuff, or at least haven’t run across any, but there are possible answers. Partly, it depends on what we mean by reality, and partly on evolving to deal with it. For example, while humans are very good at identifying and assessing immediate threats, they’re generally bad at identifying and assessing long-term threats. It’s a lot easier to decide someone is running out of the shadows to attack you than to understand long term threats to housing markets or the environment. So it’s possible that contemporary humans are creating a world they’re not evolved enough to understand.

      There’s also the question of reality. Assessing a threat or interpreting the results of research accurately are specific parts of reality that can be isolated and evaluated practically. However, the stuff in the paragraph you quote is more about motivating action and sustaining a high opinion of ourselves, and for that an understanding of reality isn’t what most people want. They would rather have a comforting illusion than an uncomfortable reality, because the comforting illusion keeps them going.

      Religion functions like this. Regardless of the preferred status you give to your own religion, the metaphysical claims of the world religions often conflict, so that while they might all be wrong about the world they can’t possibly all be right, and thus the majority of the world’s population are living under false ideas of reality. Also, many religious claims can’t be proved rationally, which is why people believe them on faith. But what can be proven scientifically, and indeed has been, is the emotional and psychological benefit people gain individually from religious faith, whether it’s the feeling of in-group belonging or sense of superiority that drives people, or the effect of religious faith on the outlooks of cancer patients. I’ve also read that depressed and suicidal people often have a very good understanding of the truth of their situations.

      Telling stories about the world that make us feel better and that give us a sense of belonging seems to be the way humans work, regardless of the ultimate truths of the claims we might make. So maybe it’s the details we have to get right, while the big picture is only important if it motivates us to action or gives us comfort and confidence.

  5. Wayne,

    As always, I appreciate your thoughtfulness. What you say here makes some sense – to a point, I think.

    It seems we can readily agree with most any other human being that knowing some basic facts or “truths” (little t) on the ground might have some obvious, immediate survival value, for instance, when we both immediately respond to the sight of the hungry tiger and run away.

    But here is the real question, I think: why would our evolved (and evolving) reason and sense “equipment” be useful for anything more complicated and abstract than this – and if it seems to be, why should we trust it? Why, for instance, would paleontologists who postulate that a bone with fresh dino blood and vessels is 65 million years old based on their understanding of radiometric dating methods, the geological column, taxonomy, and sequences of “index” fossils be more readily favored and selected by the evolutionary process over the practical geologists who learn to efficiently mine and refine iron, making weapons of war? (let’s assume these geologists aren’t barbarians and also have great social skills – perhaps because of a little bit of that tendency you mention to believe false things about themselves, giving them more confidence – which no doubt, are as valuable if not more valuable than tool making).

    Is there any way to definitively prove that we, in our scientific explanations, are capable of producing complicated theories and models that even begin to be accurate representations of reality – precisely since many of these explanations do not have any obvious, immediate survival value?

    It seems to me that evolutionary thinking can’t help but undercut the value of the concept of truth – and the branch that itself sits on. Ironically, it would seem to only be a theistic view of the creation (which includes God’s endowing us with reliable powers of sense and reason, or our “epistemic equipment”) that would give us reason for having confidence in our theories or models as “maps” that help us get closer to the Truth (i.e. the big picture) “out there”.

    Or not?


    • Nathan, I think you’re reaching here. I see no argument here that the theory of evolution is undermining a belief in truth or undercutting itself in any way. And the remarkable progress of science, medicine, and technology since the 18th century demonstrates the scientific worldview has a better understanding of the “real world” than any prescientific beliefs do, and in openly testable ways. You’re trying to make a theological argument, but there’s no point with me because I don’t share your theological premises.

  6. Wayne,

    I understand you think I’m reaching. To my knowledge the points I’m making have been made by non-theologians as well – I actually recall hearing about these points – or points like them – being made in a philosophy journal by a non-theist. If we are not designed to know Truth why should we assume that we can know it? We were “designed” to survive, and here being deceived by our senses (not only as regards our views of ourselves) may be just as useful as being able to get a totally accurate map. It seems that what works is really what matters.


    • Nathan, if that’s what you were arguing, perhaps I misunderstood. Regarding “what works is really what matters,” that’s defended explicitly by philosophers such as Nietzsche or the pragmatists, and is in a sense the underlying assumption of science. As for our human understanding, it’s only religion and premodern philosophies that claim to have Truth with a capital T about the world, especially Truth that can’t be established by evidence. Science is much more epistemologically modest.

  7. Wayne,

    Hey thanks for the engagement. I’ll just stop now before I get to be too obnoxious… : )


  8. Jordan Hunt says:

    Great article. I often feel frustrated when librarians and library advocates talk about information literacy as if it’s a science. It’s not as if we can finally settle the nation’s political divisions through information literacy, as some writers of articles appearing in Library Journal and elsewhere seem to imply. Evaluating sources critically is more subjective than most people care to admit.

  9. Jordan Hunt says:

    Though I have some questions. Is information literacy merely unnatural, or truly impossible as defined by the ACRL? If that definition is impossible, is there another approach or conception that is preferable?

    • I haven’t completely thought this through, but I want to say it’s unnatural, not impossible, at least for focused projects. That it’s unnatural says nothing against it other than that learning it will be more of a struggle than learning to walk or speak. Our minds don’t want to function objectively at all. On the other hand, by understanding our biases and compensating for them in given projects, I think IL is achievable in a limited sense. Once we extend IL or critical thinking or something like that to our entire lives, I doubt it’s possible. Too many smart, critical people believe too many dumb things. I’m sure I do, too, I just can’t recognize which are the dumb things because of my inherent biases.