For almost 20 years, instruction librarians have relied on variations of two models, the CRAAP Test and SIFT, to teach students how to evaluate printed and web-based materials. Dramatic changes to the information ecosystem, however, present new challenges: a flood of misinformation, and algorithms that lie beneath the surface of popular and library platforms, collecting clicks and shaping content. When applied to increasingly connected networks, these existing evaluation heuristics have limited value. Drawing on our combined experience at community colleges and universities in the U.S. and Canada, and with Project Information Literacy (PIL), a national research institute that has studied college students’ information practices for the past decade, this paper presents a new evaluative approach that teaches students to see information, rather than themselves, as the agent. We identify opportunities and strategies for evaluating the veracity of sources, first as students, leveraging the expertise they bring with them into the classroom, and then as lifelong learners in search of information they can trust and rely on.
Arriving at deeply considered answers to important questions is an increasingly difficult task. It often requires time, effort, discernment, and a willingness to dig below the surface of Google-ready answers. Careful investigation of content is needed more than ever in a world where information is in limitless supply but often tainted by misinformation, while insidious algorithms track and shape the content that users see on their screens. Teaching college students the evaluative strategies essential for academic success and for daily life is one of the greatest challenges of information literacy instruction today.
In the last decade, information evaluation — the ability to ferret out the reliability, validity, or accuracy of sources — has changed substantively in both teaching practice and meaning. The halcyon days of teaching students the CRAAP Test [1], a handy checklist for determining the credibility of digital resources, are over [2]; and, in many cases, SIFT [3], another reputation heuristic, is now in use on numerous campuses. At the same time, evaluative strategies have become more nuanced and complex as librarians continue to debate how best to teach these critically important skills in changing times. [4]
In this article, we introduce the idea of proactivity as an approach that instruction librarians can use for re-imagining evaluation. We explore new ways of encouraging students to question how information works, how information finds them, and how they can draw on their own strengths and experiences to develop skills for determining credibility, usefulness, and trust of sources in response to an information ecosystem rife with deception and misinformation. Ultimately, we discuss how a proactive approach empowers students to become experts in their own right as they search for reliable information they can trust.
2. A short history of two models for teaching evaluation
Mention “information literacy instruction” and most academic librarians and faculty think of evaluation frameworks or heuristics that have been used and adapted for nearly two decades. The most widely known are the CRAAP method, and more recently, SIFT, both designed to determine the validity and reliability of claims and sources.
CRAAP debuted in 2004 [5] when several academic librarians developed an easy-to-use assessment framework for helping students and instructors evaluate information for academic papers. CRAAP, a catchy acronym for Currency, Relevance, Authority, Accuracy, Purpose, walks students through criteria for assessing found content. For librarians, this approach to evaluation is a manifestation of the Information Literacy Competency Standards for Higher Education developed by the ACRL, and especially an outcome of Standard 3.2: “Examines and compares information from various sources in order to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias.” [6]
When the CRAAP method was first deployed nearly 20 years ago, the world was still making the transition from Web 1.0 to Web 2.0. Most online content was meant to be consumed, not interacted with, altered, or shared. CRAAP was developed in a time when you found information, before the dramatic shift to information finding you. As monolithic players like Google and Facebook began using tracking software on their platforms in 2008 and selling access to this information in 2012, web evaluation became a very different process. In a role reversal, media and retail platforms, such as Amazon, began evaluating their users to determine what information they should receive, rather than users evaluating the information they found.
Since 2015, criticism of the CRAAP test has mounted, despite its continued and widespread use on campuses. Checklists like CRAAP are meant to reduce cognitive overload, but they can actually increase it, leading students to make poor decisions about the credibility of sources, especially in densely interconnected networks. [7] As one critic has summed it up: “CRAAP isn’t about critical thinking – it’s about oversimplified binaries.” [8] We agree: CRAAP was designed for a fairly narrow range of situations, where students might have little background knowledge to help them judge claims and often had to work within constraints of format, date, or other instructor-imposed requirements; these situations bore little resemblance to everyday interactions with information, even then.
When Mike Caulfield published the SIFT model in 2019, it gave instruction librarians a progressive alternative to the CRAAP test. Caulfield described his evaluation method as a “networked reputation heuristic,” [9] developed in response to the spread of misinformation and disinformation in the post-truth era. The four “moves” he identified — Stop, Investigate, Find, Trace — are meant to help people recontextualize information by placing a particular work and its claims within the larger realm of content about a topic.
SIFT offers major improvements over CRAAP in speed, simplicity, and applicability to a wider range of print and online publications, platforms, and purposes. Recently, researchers have identified the benefits of using this approach [10] and, in particular, the lateral reading strategies it incorporates. SIFT encourages students to base evaluation on cues that go beyond the intrinsic qualities of an article and to use comparisons across media sources to understand its trustworthiness. Justin Reich [11], Director of the MIT Teaching Systems Lab, made this point in a 2020 Project Information Literacy (PIL) interview, calling SIFT a useful “first step,” since it may assist students in acquiring the background knowledge they need to evaluate the next piece of information they encounter on a topic.
Crucially, SIFT also includes the context of the information need as part of evaluation – some situations require a higher level of verification than others. The actions SIFT recommends are more closely aligned with the kind of checking students already use to detect bias [12] and decide what to believe, and with how researchers themselves judge the quality of information. [13] And while it is much better suited to today’s context, where misinformation abounds and algorithms proliferate, SIFT is still based on students encountering individual information objects, without necessarily understanding them as part of a system.
Our proposed next step, what we call proactive evaluation, would allow them not only to evaluate what they’re seeing but consider why they’re seeing what they do and what might be missing. SIFT, like CRAAP, is based on a reactive approach: the individual is an agent, acting upon information objects they find. In today’s information landscape, we think it is more useful to invert this relationship and consider the information object as the agent that is acting on the individual it finds.
3. Information with agency
Thinking of information as having agency allows us to re-examine the information environment we think we know. By the time they get to college, today’s students are embedded in the information infrastructure: a social phenomenon of interconnected sources, creators, processes, filters, stories, formats, platforms, motivations, channels, and audiences. Their profiles and behaviors affect not only what they see and share but also the relative prominence of stories, images, and articles in others’ feeds and search results. Information enters, flows through, and ricochets around the systems they inhabit – fueled, funded, and filtered by data gathered from every interaction.
Research from PIL [14] and elsewhere [15] indicates that students who see algorithmic personalization at work in their everyday information activities already perceive information as having agency: specifically, the ability to find them, follow them across platforms, and keep them in filter bubbles. They understand the bargain they are required to make with corporations like Amazon, Alphabet, and Facebook, where they exchange personal data for participation in communities, transactions, or search efficiency.
When PIL interviewed 103 undergraduates at eight U.S. colleges and universities in 2019 for the algorithm study, one student at a liberal arts college described worries we heard from others about the broader social impact of these systems: “I’m more concerned about the large-scale trend of predicting what we want, but then also predicting what we want in ways that push a lot of people towards the same cultural and political endpoint.” [16]
This student’s concern relates to the effects of algorithmic personalization and highlights student awareness of deliberate efforts to affect and, in many cases, infect the system. [17] Subverting the flow of information for fun and profit has become an all-too-common practice for trolls, governments, corporations, and other interest groups. [18] The tactics we’ve taught students for evaluating items one at a time provide slim defenses against the networked efforts of organizations that flood feeds, timelines, and search results. While SIFT at least considers information as part of an ecosystem, we still need to help students go beyond evaluating individual information objects and understand the systems that intervene during the search process, sending results with the agency to nudge, if not shove, users in certain directions.
That is why it is time to consider a new approach to the teaching of source evaluation in order to keep up with the volatile information ecosystem. Allowing for information to have agency, i.e. acknowledging information as active, targeted, and capable of influencing action, fundamentally alters the position of the student in the act of evaluation and demands a different approach from instruction librarians. We call this approach proactive evaluation.
4. Proactive evaluation
What happens if we shift our paradigm from assuming that students are agents in the information-student interaction to assuming that the information source is the agent? This change in perspective will dramatically reframe our instruction in important ways. This perspective may initially seem to disempower the information literacy student and instructor, but given widespread disinformation in this post-truth era, this reversal might keep us, as instructors, grounded in our understanding of information literacy.
Once we shift the understanding of who is acting upon whom, we can shift our approaches and techniques to reflect this perspective. This change in thinking allows us to move from reactive evaluation, that is, “Here is what I found, what do I think of it?” to proactive evaluation, “Because I understand where this information came from and why I’m seeing it, I can trust it for this kind of information, and for this purpose.”
What does a proactive approach look like? Table 1 presents comparisons between reactive and proactive approaches to information literacy as a starting point for thinking about this shift in thinking. This typology acknowledges that college and university students come into our classrooms with a deep and wide knowledge of the information landscapes in which they exist.
Table 1. A Model for Transitioning from Reactive to Proactive Evaluation

| Reactive | → | Proactive |
| --- | --- | --- |
| Understanding of information | | |
| Individual objects you find | → | Networked objects that find you |
| Understanding of evaluation | | |
| Intrinsic (to the object) | → | Contextual (within the network) |
| User is the agent | Information is the agent → | Both the user and the information have agency in a dynamic relationship |
| How/what we teach | | |
| Closed yes/no questions with defined answers | → | Open questions |
| Binaries (good/bad, scholarly/popular) | → | Contextual continua (useful for topic x in circumstance y if complemented by z) |
| Student as perpetual novice (evaluates from scratch every time) | → | Student as developing expert with existing knowledge, who brings expertise about information, subject, sources, processes |
| Evaluate individual objects with novice tools and surface heuristics | → | Evaluate based on network context and connections, and build networks of trusted knowledge/sources |
| CRAAP | SIFT → | Into the unknown |
As this typology suggests, our thinking rejects the “banking model of education,” where students are empty vessels that educators must fill. [19] To illustrate this point, PIL’s 2020 algorithm study confirmed what we have long suspected: many students are already using evasive strategies to circumvent algorithmic tracking and bias. Their tactics, learned from friends and family rather than their instructors, range from creating throwaway email accounts to using VPNs and ad-blocking apps to guard their personal data from algorithms. [20]
Students know that information is constantly trying to find them, identify them, label them, and sway them. And they may know this better than the faculty who teach them. [21] Applying this to information literacy instruction means acknowledging that students approach information skeptically, and that at least some students arrive in the classroom with defensive practices for safeguarding their privacy and mitigating invasive, biased information as they navigate the web and search for information.
To build on this premise, we should be asking students to apply their defensive strategies to classroom-based tasks: “If this information showed up in your news stream, what tactics would you use to decide if you wanted to pass it along?” “What do you look for to know if this is valid or useful information?” Instead of asking yes/no questions, e.g., “Is it written by an expert?” or “Is it current?”, we should shift our assessment questions toward open-ended inquiry with students.
An example of how this could work would be asking the class what they do when they encounter a new piece of information in their own information landscape, such as a news story. How would students go about deciding whether to reshare it? What are their motivations for sharing a news story? In PIL’s news study, for instance, more than half of the almost 6,000 students surveyed (52%) said their reason for sharing news on social media was to let friends and followers know about something they should be aware of, while more than two-fifths (44%) said sharing news gives them a voice about a larger political or social cause. [22] Does the same drive hold true for students in this classroom example?
Librarians using a proactive approach like this one could hold a classroom discussion to see whether their students also see themselves as stewards of what is important to know, while having a voice about larger causes in the world. A proactive approach also allows students to bring their prior networked knowledge into the discussion, rather than looking at a single point of information in isolation when directed by an instruction librarian. Asking students to make their tacit processes explicit will also help them see more clearly the information networks they have already built. They may be using other factors in their decision-making, like who recommended a source or the context in which the information will be used. These evaluation points are also used by researchers when assessing the credibility of information. [23] Providing opportunities for students to reflect on and articulate their interactions with information in subject areas where they feel confident may allow them to transfer skills more easily to new, less familiar academic domains.
Having students share these kinds of spontaneous reflections leverages the social aspect of information skills. PIL studies have shown repeatedly that students lean on each other when they evaluate content for academic, employment, and everyday purposes; when necessary, they also look to experts, including their instructors, to suggest or validate resources. Evaluation is fundamentally a social practice, but the existing heuristics don’t approach it this way. Reliance on other people as part of trusted information networks is rarely even acknowledged, let alone explicitly taught in formal instruction, as we tend to focus on the stereotype of the solitary scholar.
As they gain understanding of their own information networks, students can learn to see the operations of other networks, within disciplines, news, and other commercial media. If they are aware of the interconnectedness of information, they can use those connections to evaluate content and develop their mental Rolodexes of trusted sources. [24] Understanding which sources are trustworthy for which kinds of information in which contexts is foundational knowledge for both academic work and civic engagement.
Building on SIFT strategies, it’s possible for students to accumulate knowledge about sources by validating them with tools like Wikipedia. Comparing and corroborating may illuminate the impact of algorithms and other systems that make up the information infrastructure. [25] Developing this kind of map of their network of trusted sources can help them search and verify more strategically within that network, whether they’re in school or not.
As they come to understand themselves as part of the information infrastructure, students may be able to reclaim some agency from the platforms that constrain and control the information they see. While they may not ever be able to fully escape mass personalization, looking more closely at its effects may increase awareness of when and how search results and news feeds are being manipulated. Students need to understand why they see the information that streams at them, the news that comes into their social media feeds, the results that show up at the top of a search, and what they can do to balance out the agency equation and regain some control.
Admittedly, this form of instruction is clearly more difficult to implement than turnkey checklists and frameworks. It is much harder to fit into the precious time of a one-shot. It requires trust in the students, trust in their prior knowledge, and trust in their sense-making skills. This change in perspective about how we teach evaluation is not a magic bullet for fixing our flawed instruction practices. But we see proactive evaluation as an important step for moving our profession forward in teaching students how to navigate an ever-changing information landscape. This proactive model can be used in conjunction with, or independent of, SIFT to create a more complex information literacy.
Reactive evaluation considers found information objects in isolation, based on intrinsic qualities, regardless of the user or intended use. In a proactive approach, the user considers the source while evaluating information contextually, through its relationships to other sources and to the user. Over time, a user can construct their own matrix of trusted sources. It’s similar to getting to know a new city; a newcomer’s mental map gradually develops overlays of shortcuts, the safe and not-so-safe zones, and the likely places to find what they need in a given situation. Eventually, they learn where to go for what, a critical thinking skill they can take with them through the rest of their education and everyday lives and apply with confidence long after graduation.
5. Into the unknown
Reactive approaches to evaluation are not sufficient to equip students to navigate the current and evolving information landscape. What we have proposed in this paper is an alternative, what we call a proactive approach, to information evaluation that moves away from finite and simple source evaluation questions to open-ended and networked questions. While a proactive approach may feel unfamiliar and overwhelming at first, it moves away from the known to the unknown to create a more information-literate generation of students and lifelong learners.
But what if this approach is actually not as unfamiliar as it may seem? The current ACRL framework paints a picture of the “information-literate student” that speaks to a pedagogy that cultivates a complex and nuanced understanding of the information creation process and landscape. For example, in the “Scholarship as Conversation” frame, these dispositions include “recognize that scholarly conversations take place in various venues” and “value user-generated content and evaluate contributions made by others.” [26]
Both dispositions require a nuanced understanding of the socialness of scholarship and imply evaluation within a social context. And while heuristics that rely on finite and binary responses are easy to teach, they create more problems than they solve. Focusing on the network processes that deliver the information in front of us, instead of focusing on these finite questions, allows for a different kind of knowing.
The next question for instructors to tackle is what this proactive approach looks like in the classroom. In our field, discussions of “guide on the side” and “sage on the stage” are popular, but what we are actually advocating in this article isn’t a guide or a sage, as both assume a power structure and expertise that is incomplete and outdated. In the classroom, we advocate a shift from guiding or lecturing to conversation. We do not have a set of desired answers that we are hoping to coax out of the students: Which of these sources is valid? Who authored this source, and are they an expert? Rather, a proactive approach encourages students to engage and interact with their ideas and previous experiences around information agency, the socialness of the information, and how they evaluate non-academic sources. This will allow students to bring their deep expertise into the classroom.
We have alluded to open-ended questions as part of the proactive approach, but this is more accurately described as an open dialogue. This type of instruction is difficult in the one-shot structure, as it relies on trust. An unsuccessful session looks like your worst instruction experience, with the students staring blankly at you and not engaging, leaving lots of empty space and the strong desire to revert to lecturing on database structures. A successful session will feel like an intellectual conversation where you as the “teacher” learn as much as you impart, and the conversation with students is free-flowing and engaging.
Returning to the earlier example of asking how a student would choose whether or not to reshare a news story, this type of dialogue could include conversations about what they already know about the news source, what they know about the person or account that initially shared it, how they might go about reading laterally, what their instincts say, how this does or does not fit with their prior knowledge of the subject, and their related reactions. Over the course of the discussion, it will become clear which areas of information literacy and assessment need more dialogue, and in which areas the students are already skilled and comfortable.
The kind of information literacy instruction that assumes agency rests solely with the user, who finds and then evaluates individual information objects, is no longer valid now that information seeks out the user through networked connections. This reversal of the power dynamic underlies many of the gaps between how evaluation is taught in academic settings and how it occurs in everyday life. The approach we advocate balances out these extremes and helps students recognize and regain some of their agency. By understanding how information infrastructures work and their roles within them, students can adapt the tactics that many of them are already using to become more conscious actors.
6. Looking Ahead
In this article, we have discussed an alternative to current evaluation approaches that is closely tied to the issue of trust: trusting our students to bring their own experiences and expertise to the information literacy classroom. But our work doesn’t end there. Our approach also requires us to trust ourselves as instructors. We will need to trust that we do in fact understand the continuously changing information landscape well enough to engage with open-ended, complex questions, rather than a prescribed step-by-step model. We must continue to inform ourselves and reevaluate information systems — the architectures, infrastructures, and fundamental belief systems — so we can determine what is trustworthy. We have to let go of simple solutions to teach about researching complex, messy problems.
For college students in America today, knowing how to evaluate news and information is not only essential for academic success but urgently needed for making sound choices during tumultuous times. We must embrace that instruction, and information evaluation, are going to be ugly, hard, and confusing for us to tackle but worth it in the end to remain relevant and useful to the students we teach.
We are grateful to Barbara Fister, Contributing Editor of the “PIL Provocation Series” at Project Information Literacy (PIL) for making incisive suggestions for improving this paper, and Steven Braun, Senior Researcher in Information Design at PIL, for designing Table 1. The article has greatly benefited from the reviewers assigned by In the Library with the Lead Pipe: Ian Beilin, Ikumi Crocoll, and Jessica Kiebler.
“Framework for Information Literacy for Higher Education.” 2016. Association of College and Research Libraries. January 16. https://www.ala.org/acrl/standards/ilframework.
“Information Literacy Competency Standards for Higher Education.” 2000. Association of College and Research Libraries. January 18. http://www.acrl.org/ala/mgrps/divs/acrl/standards/standards.pdf.
Bengani, Priyanjana. “As Election Looms, a Network of Mysterious ‘Pink Slime’ Local News Outlets Nearly Triples in Size.” Columbia Journalism Review, August 4, 2020. https://www.cjr.org/analysis/as-election-looms-a-network-of-mysterious-pink-slime-local-news-outlets-nearly-triples-in-size.php.
Blakeslee, Sarah. “The CRAAP Test.” LOEX Quarterly 31, no. 3 (2004). https://commons.emich.edu/loexquarterly/vol31/iss3/4.
Breakstone, Joel, Mark Smith, Priscilla Connors, Teresa Ortega, Darby Kerr, and Sam Wineburg. “Lateral Reading: College Students Learn to Critically Evaluate Internet Sources in an Online Course.” The Harvard Kennedy School Misinformation Review 2, no. 1 (2021): 1–17. https://doi.org/10.37016/mr-2020-56.
Brodsky, Jessica E., Patricia J. Brooks, Donna Scimeca, Ralitsa Todorova, Peter Galati, Michael Batson, Robert Grosso, Michael Matthews, Victor Miller, and Michael Caulfield. “Improving College Students’ Fact-Checking Strategies through Lateral Reading Instruction in a General Education Civics Course.” Cognitive Research: Principles and Implications 6 (2021). https://doi.org/10.1186/s41235-021-00291-4.
Caulfield, Mike. “A Short History of CRAAP.” Blog. Hapgood (blog), September 14, 2018. https://hapgood.us/2018/09/14/a-short-history-of-craap/.
———. Truth is in the network. Email, May 31, 2019. https://projectinfolit.org/smart-talk-interviews/truth-is-in-the-network/.
———. Web Literacy for Student Fact-Checkers, 2017. https://webliteracy.pressbooks.com/.
Dubé, Jacob. “No Escape: The Neverending Online Threats to Female Journalists.” Ryerson Review of Journalism, Spring 2018 (May 28, 2018). https://rrj.ca/no-escape-the-neverending-online-threats-to-female-journalists/.
Fister, Barbara. “The Information Literacy Standards/Framework Debate.” Inside Higher Ed, Library Babel Fish, January 22, 2015. https://www.insidehighered.com/blogs/library-babel-fish/information-literacy-standardsframework-debate.
Foster, Nancy Fried. “The Librarian-Student-Faculty Triangle: Conflicting Research Strategies?” Library Assessment Conference, 2010. https://urresearch.rochester.edu/researcherFileDownload.action?researcherFileId=71.
Freire, Paulo. “The Banking Model of Education.” In Critical Issues in Education: An Anthology of Readings, 105–17. Sage, 1970.
Haider, Jutta, and Olof Sundin. “Information Literacy Challenges in Digital Culture: Conflicting Engagements of Trust and Doubt.” Information, Communication and Society, 2020. https://doi.org/10.1080/1369118X.2020.1851389.
Head, Alison J., Barbara Fister, and Margy MacMillan. “Information Literacy in the Age of Algorithms.” Project Information Literacy Research Institute, January 15, 2020. https://projectinfolit.org/publications/algorithm-study.
Head, Alison J., John Wihbey, P. Takis Metaxas, Margy MacMillan, and Dan Cohen. “How Students Engage with News: Five Takeaways for Educators, Journalists, and Librarians.” Project Information Literacy Research Institute, October 16, 2018. https://projectinfolit.org/pubs/news-study/pil_news-study_2018-10-16.pdf.
Maass, Dave, Aaron Mackey, and Camille Fischer. “The Foilies 2018.” Electronic Frontier Foundation, March 11, 2018. https://www.eff.org/deeplinks/2018/03/foilies-2018.
Meola, Marc. “Chucking the Checklist: A Contextual Approach to Teaching Undergraduates Web-Site Evaluation.” Libraries and the Academy 4, no. 3 (2004): 331–44. https://doi.org/10.1353/pla.2004.0055.
Reich, Justin. Tinkering Toward Networked Learning: What Tech Can and Can’t Do for Education. December 2020. https://projectinfolit.org/smart-talk-interviews/tinkering-toward-networked-learning-what-tech-can-and-cant-do-for-education/.
Seeber, Kevin. “Wiretaps and CRAAP.” Blog. Kevin Seeber (blog), March 18, 2017. http://kevinseeber.com/blog/wiretaps-and-craap/.
- The CRAAP Test (Currency, Relevance, Authority, Accuracy, Purpose) is a reliability heuristic designed by Sarah Blakeslee and her librarian colleagues at Chico State University. See: Sarah Blakeslee, “The CRAAP Test,” LOEX Quarterly 31 no. 3 (2004): https://commons.emich.edu/loexquarterly/vol31/iss3/4 [↩]
- Kevin Seeber, “Wiretaps and CRAAP,” Kevin Seeber [Blog], (March 18, 2017): http://kevinseeber.com/blog/wiretaps-and-craap/ [↩]
- Mike Caulfield, “The Truth is in the Network” [email interview by Barbara Fister], Project Information Literacy, Smart Talk Interview, no. 31, (December 1, 2020)https://projectinfolit.org/smart-talk-interviews/truth-is-in-the-network/ [↩]
- Barbara Fister, “The Information Literacy Standards/Framework Debate,” Library Babel Fish column, Inside Higher Education, (January 22, 2015): https://www.insidehighered.com/blogs/library-babel-fish/information-literacy-standardsframework-debate [↩]
- Sarah Blakeslee, “The CRAAP test,” op. cit. https://commons.emich.edu/loexquarterly/vol31/iss3/4 [↩]
- Association of College and Research Libraries, Information Literacy Competency Standards for Higher Education, (2000), https://alair.ala.org/handle/11213/7668 Note: These standards were rescinded in 2016. [↩]
- Mike Caulfield, “A Short History of CRAAP,” Hapgood, (June 14, 2018): https://hapgood.us/2018/09/14/a-short-history-of-craap/ [↩]
- Kevin Seeber (March 18, 2017), “Wiretaps and CRAAP,” op. cit. [↩]
- Mike Caulfield, “The Truth is in the Network,” op. cit. Caulfield developed SIFT from an earlier version of this heuristic, “four moves and a habit,” described in his 2017 OER book Web Literacy for Student Fact-Checkers: https://webliteracy.pressbooks.com [↩]
- Jessica E. Brodsky, Patricia J. Brooks, Donna Scimeca, Ralitsa Todorova, Peter Galati, Michael Batson, Robert Grosso, Michael Matthews, Victor Miller, and Michael Caulfield, “Improving College Students’ Fact-Checking Strategies Through Lateral Reading Instruction in a General Education Civics Course,” Cognitive Research: Principles and Implications 6, no. 1 (2021): 1-18, https://doi.org/10.1186/s41235-021-00291-4; Joel Breakstone, Mark Smith, Priscilla Connors, Teresa Ortega, Darby Kerr, and Sam Wineburg, “Lateral Reading: College Students Learn to Critically Evaluate Internet Sources in an Online Course,” The Harvard Kennedy School Misinformation Review 2, no. 1 (2021): 1-17, https://doi.org/10.37016/mr-2020-56 [↩]
- Justin Reich, “Tinkering Toward Networked Learning: What Tech Can and Can’t Do for Education” [email interview by Barbara Fister], Project Information Literacy, Smart Talk Interview, no. 33, (December 2020): https://projectinfolit.org/smart-talk-interviews/tinkering-toward-networked-learning-what-tech-can-and-cant-do-for-education/ [↩]
- Alison J. Head, John Wihbey, P. Takis Metaxas, Margy MacMillan, and Dan Cohen, How Students Engage with News: Five Takeaways for Educators, Journalists, and Librarians, Project Information Literacy Research Institute, (October 16, 2018), pp. 24-28, https://projectinfolit.org/pubs/news-study/pil_news-study_2018-10-16.pdf [↩]
- Nancy Fried Foster, “The Librarian‐Student‐Faculty Triangle: Conflicting Research Strategies?” 2010 Library Assessment Conference, (2010): https://urresearch.rochester.edu/researcherFileDownload.action?researcherFileId=71 [↩]
- Alison J. Head, Barbara Fister, and Margy MacMillan, Information Literacy in the Age of Algorithms, Project Information Literacy Research Institute, (January 15, 2020): https://projectinfolit.org/publications/algorithm-study [↩]
- Jutta Haider and Olof Sundin, “Information Literacy Challenges in Digital Culture: Conflicting Engagements of Trust and Doubt,” Information, Communication & Society, ahead-of-print (2020), https://doi.org/10.1080/1369118X.2020.1851389 [↩]
- Alison J. Head, Barbara Fister, and Margy MacMillan (January 15, 2020), op. cit. [↩]
- Alison J. Head, Barbara Fister, and Margy MacMillan, (January 15, 2020), op. cit., 5-8. [↩]
- See, for example, Dave Maass, Aaron Mackey, and Camille Fischer, “The Foilies, 2018,” Electronic Frontier Foundation, (March 11, 2018): https://www.eff.org/deeplinks/2018/03/foilies-2018; Jacob Dubé, “No Escape: The Neverending Online Threats to Female Journalists,” Ryerson Review of Journalism, (May 28, 2018): https://rrj.ca/no-escape-the-neverending-online-threats-to-female-journalists/; Priyanjana Bengani, “As Election Looms, a Network of Mysterious ‘Pink Slime’ Local News Outlets Nearly Triples in Size,” Columbia Journalism Review, (August 4, 2020): https://www.cjr.org/analysis/as-election-looms-a-network-of-mysterious-pink-slime-local-news-outlets-nearly-triples-in-size.php [↩]
- Paulo Freire, “The Banking Model of Education,” in Eugene F. Provenzo (ed.), Critical Issues in Education: An Anthology of Readings, Sage, (1970): 105-117. [↩]
- Alison J. Head, Barbara Fister, and Margy MacMillan (January 15, 2020), op. cit., 16-19. [↩]
- Alison J. Head, Barbara Fister, and Margy MacMillan (January 15, 2020), op. cit., 22-25. [↩]
- Alison J. Head, John Wihbey, P. Takis Metaxas, Margy MacMillan, and Dan Cohen (October 16, 2018), op. cit., 20. [↩]
- Nancy Fried Foster (2010), op. cit. [↩]
- Barbara Fister, “Lizard People in the Library,” PIL Provocation Series, no. 1, Project Information Literacy Research Institute, (February 3, 2021): https://projectinfolit.org/pubs/provocation-series/essays/lizard-people-in-the-library.html [↩]
- Marc Meola, “Chucking the Checklist: A Contextual Approach to Teaching Undergraduates Web-Site Evaluation,” portal: Libraries and the Academy 4, no. 3 (2004): 331-344, https://doi.org/10.1353/pla.2004.0055, p. 338 [↩]
- Association of College and Research Libraries, Framework for Information Literacy for Higher Education, (2016): http://www.ala.org/acrl/standards/ilframework [↩]