Librarians tend to view information literacy in light of the ACRL Information Literacy Competency Standards. Information literacy is a set of competencies, a set of things we should be able to do. If you’re information literate, you should be able to, among other things:
- Determine the extent of information needed
- Access the needed information effectively and efficiently
- Evaluate information and its sources critically
- Incorporate selected information into one’s knowledge base
These all sound like good, sensible outcomes for higher education. However, one of the many problems with becoming information literate in any robust sense is that it’s completely unnatural. The entire enterprise goes against the way the human mind tends to gather and use information. Human beings are animals perhaps capable of information literacy but apparently designed to work in other ways.
You probably didn’t balk at that last sentence. Human beings are designed to work in other ways. You might well believe human beings are “designed.” But did you come to that belief through a neutral and critical evaluation of the available evidence, or through some other route? If you evaluated the evidence for such design critically, you might be much less sure of that belief and conclude, along with the scientific consensus, that humans are the product of evolution, not design.
However, regardless of whether it is “designed” or not, “the human mind is highly prone to detecting agency,” according to an article in the recent book The Philosophy of Pseudoscience, “and it often does so even in the absence of agents.” We might think a person is nearby even if it’s only the wind. From an evolutionary perspective, in our history “it is far less costly to assume that there is an agent when there is none than the other way around.” After all, if it’s only the wind and we’re briefly wary that someone is behind us, that’s better than ignoring the sound and being attacked. We think we’re designed because we want to attribute everything to a designing agent. It’s just natural.
Even if you don’t think we’re designed, you’re probably comfortable with that kind of language, and language shapes our thought. In this case, the language is in line with what Robert McCauley calls “maturational naturalness,” the sort of thinking that comes naturally to us just because we grow up. We’re comfortable attributing agency even where it doesn’t exist, because that’s just the way we work. Skills like information literacy, by contrast, can achieve only “practiced naturalness,” and then only if people actually practice them.
The literature on pseudoscience is rife with examples of flawed reasoning, but the flawed reasoning is a result of our natural thought patterns. Scientific reasoning and critical thinking, the motors behind information literacy and the academic enterprise, are learned and rare, which is why most of us reason poorly much of the time and all of us reason poorly some of the time.
Aristotle wrote that humans by nature desire to know, but that doesn’t seem to be true. Psychologists studying how the mind functions might say instead that humans by nature desire to interpret the world in a way that makes sense to them and makes them feel good about themselves, regardless of the facts. It’s called motivated thinking, and one review of the literature claims that “individuals’ preferences for certain outcomes are believed to often shape their thinking so as to all but guarantee that they find a way to believe, decide, and justify whatever they like.” Consider that in relation to the task of teaching information literacy.
Such thinking is especially prevalent around issues of great importance to us, such as politics or religion. In every area of belief, we want to be right, or at least considered right, but when it comes to beliefs core to our definitions of ourselves, we are highly resistant to alternative beliefs. People are more likely to evaluate favorably information that confirms their beliefs and to spend more time criticizing information that challenges them. Not only that, but some studies show that when confronted with strong evidence that their beliefs are mistaken, people tend to hold those beliefs even more strongly. It doesn’t matter how rigorous or scientific the information is. What matters most is their prior beliefs and how the new information makes them feel about themselves.
There are various names for these flaws in critical thinking. A study on “motivated skepticism in the evaluation of political beliefs” focused on the following: motivated skepticism, confirmation bias, disconfirmation bias, prior attitude effect, selective exposure, attitude polarization, and cognitive dissonance. We naturally do everything we can to avoid changing our minds and everything we can to make ourselves look better.
However, these behaviors don’t just apply to politics or religion. A recent study from Finland on “core knowledge confusions among university students” found that even university-educated students had trouble “in fully differentiating the core ontology of physical, biological, and mental phenomena.” For example, “children may construe almost anything as animate,” as if the moon were a living being because it “moves” across the sky. Adults aren’t necessarily that much better, even educated ones. Students were given 30 statements such as “plants know the seasons” or “furniture wants a home.” “Half of the participants considered at least four, and one quarter of the participants considered eight to 30 statements to be literally true.” That’s literally literally, not figuratively literally, as in such common statements as “that movie literally scared me to death.”
In addition, consider Daniel Kahneman’s and others’ work on fast and slow thinking, which shows that quick, intuitive thinking comes naturally to us but is often inferior for anything that requires more complex thought. The slower, more deliberate thinking is difficult and indeed physically draining.
The natural working of the human mind explains why so many people believe in astrology, crystal healing, or homeopathy, despite the lack of evidence that they work. Even trained scientists and academics aren’t immune to these problems and are often guilty of confirmation bias or the use of selective evidence.
Information literacy in a strong sense is deeply unnatural, and yet we task ourselves with teaching it. Sometimes we might feel bad for not accomplishing more, but given the workings of the human mind, when it comes to teaching information literacy, it’s amazing if we accomplish anything at all.