Each year since 2008, librarians at Carleton College read samples of sophomore writing as part of the Information Literacy in Student Writing project. The data captured through this project combined with our experiences in consultations and instruction sessions give us a richer understanding of undergraduate information literacy habits. We highlight two challenges for novices: evaluating and selecting sources, and understanding the purpose and methods of integrating sources into written work. We discuss the evidence that leads us to these conclusions and the methods we use to promote student development in these priority areas.
Carleton College’s Reference & Instruction librarians have engaged in the Information Literacy in Student Writing project (ILSW) since 2008.1 Each year, our observations while reading hundreds of papers, the data captured through this project, and our experiences in consultations and instruction sessions give us a richer understanding of undergraduate information literacy habits. As this project has evolved over ten years, students’ research behaviors have changed, as have the methods by which students retrieve their sources and the pre-college research experiences our students bring with them. In previous articles, we have examined how this project has strengthened our relationships with faculty members and how it has shaped our information literacy instruction.2 In July 2018, ten years after our initial reading, faculty members, librarians, and academic staff came together to read papers written by a representative sample of our sophomores, submitted as part of the campus-wide Sophomore Writing Portfolio.3 The conversations we had during our reading sessions, the statistical analyses done by our data consultant, and our extensive experience with research instruction and individual consultations highlight two priorities for information literacy development for novice researchers: evaluating sources and incorporating evidence into written work. In this article we present the evidence that leads us to these conclusions, and we discuss our work with students to help them learn to evaluate sources and use evidence.
Our Information Literacy Setting
Carleton College is a highly selective private, four-year, residential, liberal arts institution located in Northfield, Minnesota. The Gould Library at Carleton College serves 2,000 students and 210 full-time faculty members.4 From 2015 to 2021, the student population had the following demographic characteristics: 12% first generation, 10% international, 29% students of color (who identify as at least one ethnicity other than white), and 58% receiving financial aid. Reference & Instruction librarians offer reference service 49 hours per week and also teach information literacy concepts and research skills in library instruction sessions and individual research consultations. In non-pandemic years, we average nearly 800 consultations, 160 instruction sessions (primarily one-shot instruction), and 2,000 reference questions.
While we have a strong Writing Across the Curriculum program,5 there is no required composition course where students consistently learn source use. Instead, the library and several other academic support units on campus share responsibility for course-integrated support of key information literacy skills, such as citing sources, maintaining academic integrity, creating annotated bibliographies, and reading critically. Our key collaborators include the Writing Across the Curriculum Program, the Quantitative Resource Center, the Writing Center, and Academic Technologies. Together we work to reveal the constructed and variable disciplinary conventions that students must be able to recognize, interpret, and, in many cases, reproduce in their research products. We also help students learn to work around and push against these conventions as necessary. Studying the products of this distributed instructional model by reading the writing produced in classes spanning the curriculum gives us a holistic look at student capacities, strengths, and struggles that we could not get by assessing library interactions alone. Studying cross-disciplinary student writing in this way also helps us develop shared goals and priorities, both within the library and with other staff and faculty.
In the summer of 2018, Carleton College generously funded an expansion of our regular ILSW reading to include faculty and academic staff as readers, and to hire a data consultant to help us with our analysis. We were able to bring together stakeholders from the various areas of campus that share leadership in teaching information literacy skills to students, including the director of the Writing Across the Curriculum Program, academic technologists, librarians, the director of the Learning and Teaching Center, and faculty from the arts, humanities, social sciences, and STEM fields. In all, five librarians, six faculty members, and one academic technologist gathered for our norming session and to read an average of sixteen papers each, for a total of 150 papers6 from a stratified random sample of sophomore students.7 Together, readers identified clear, statistically significant trends in student information literacy practices. These trends also dovetail with scholarship in Communication Theory.
The Difficulties of Using Evidence: Literature Review
It is probably not news to any librarian that students sometimes struggle to deploy information effectively in their writing. The entangled cultural maneuvers involved in effective source-based writing are increasingly present in our professional thinking, in our literature,8 and in the Framework for Information Literacy for Higher Education.9 Scholarship founded in Genre Theory and Communication Theory also sheds light on the complications of learning to participate in scholarly communication. Academic genres of writing are embodiments of complex purposes and contexts. Any given utterance is “the product of the interaction between speakers and … the whole complex social situation in which the utterance emerges,”10 and “typified” utterances become genres packed with culturally encoded context and expectation.11 The written genres that students are asked to produce, such as lab reports, position papers, research papers, and reaction papers, are therefore encoded with whole constellations of socially constructed meaning. Unfortunately for novices and outsiders, these culturally encoded genres shape everything from subtle signals about where a piece of writing sits in the scholarly conversation to how readers should interpret the claims presented, and even what counts as good evidence.12
Meanwhile, the most typical genre of academic writing that many students have read prior to college is the textbook, and the social context baked into that genre places the student into the role of “information receiver and learner,” or, as psychologist Barbara K. Hofer says, “passive receptors” of knowledge.13 What is contestable or not, what counts as evidence, and how authority works are all different in the context of information reception than in the context of information creation. Students may therefore think that their primary goal is to communicate facts rather than build new insight, or they may not understand how to draw upon community expectations of authority, evidence, and argument to further their rhetorical goals. In mimicking the formats of academic writing without understanding the culturally encoded motivations and affordances of those academic genres, students struggle to use evidence to communicate effectively in their writing.
This is not to denigrate replication and mimicry. One primary way that students begin to understand academic writing and disciplinary rhetoric is by mimicking what they read. From the field of education comes the term Threshold Concepts: “A threshold concept can be considered as akin to a portal, opening up a new and previously inaccessible way of thinking about something. It represents a transformed way of understanding, or interpreting, or viewing something without which the learner cannot progress.”14 Students are understood to function in a liminal state of mimicry until they’ve crossed the threshold into a new state of understanding.15 With the advent of the Association of College and Research Libraries’ Framework for Information Literacy for Higher Education,16 librarianship and the library literature have increasingly engaged with threshold concepts. Many of us (the authors very much included) remember well our college days, when it felt far more manageable to recreate what critical thinking looked like in writing than to actually think critically about our materials. Many of us have great empathy for the feelings of inadequacy and outright fear that can come with assignments to create novel contributions to fields of study.
Evaluating and Selecting Sources
Librarians know that source evaluation is a difficult task, especially for novices in a field. There are check-list tools like the CRAAP test17 to help students learn basic source evaluation, and we know that students apply broad heuristics to the challenge of sifting through the millions of sources available to them.18 But since “authority is constructed and contextual,”19 it is no surprise that there is literature criticizing these simplified evaluation strategies.20 It is also no surprise that when we use our ILSW rubric to evaluate student writing, “Evaluation of Sources” is the area in which students struggle the most.
In the 2018 ILSW study, sophomore writers struggled to select high-quality sources that matched their rhetorical goals. On our rubric’s four-point scale from 1 (Poor) to 4 (Strong), 12% of scores indicated “Poor” Evaluation of Sources, and only 8% of scores indicated “Strong” skills in this area (see Figure 1). This rubric category also received the highest percentage of 2s (Weaknesses Interfere) and the lowest percentage of 3s (Weaknesses Do Not Interfere) compared to the other rubric categories. In addition to assigning rubric scores, readers were able to indicate key patterns that they noted in the papers they read. Fully 24% of the papers were given the designation “Sources lack breadth or depth consistent with genre,” and 15% went so far as to note a pattern of “Inappropriate sources/evidence used to support claim” (see Figure 2). In the optional free-text comments submitted by readers, more than a third of the comments addressed some aspect of source evaluation. For example, one comment read, “Cited a Daily Kos article for info on the history of Dance in the US (and this [Daily Kos] article even pointed to a scholarly book on the topic that the student didn’t look at).” Weaker papers missed obvious avenues of source exploration or relied on secondary citations, such as citing a New York Times article that mentions a research study rather than seeking out the original study, even when the original sources were readily available through more specialized search tools such as our library discovery system or disciplinary research databases. This points to a common misunderstanding among novice writers about the underlying goals of source selection: they do not always recognize the culturally constructed authority structures that they could use (or productively flout) to borrow authority from their sources more effectively.
We investigated statistical differences in our scores between native English and non-native English writers, as well as between different races, ethnicities, and genders. However, our ILSW sample did not reveal any statistically significant differences between these groups. We do not know whether this is because there were no differences or because our sample size was too small to accurately assess all demographic groups. Our 2015 Research Practices Survey that measures pre-college experience revealed greater differences between first generation students, international students, and students of color as compared with white students,21 but we did not observe such differences in the ILSW results.
These findings mirror our experiences in research consultations, where students express confusion about why particular research tasks are being asked of them, whether the sources they find fit their assignment requirements, and what kinds of sources are suited to different research topics or goals. The students’ work may be further complicated by a mismatch between their chosen topics and the source types required by their assignments, or by misidentifying source types in the first place. For example, we often see students in research consultations who think they have found articles when they have in fact found book reviews or encyclopedia entries. This could be because databases don’t make this distinction clearly enough in their metadata, or because the students don’t know what an article looks like compared with other similar genres. Even more fundamentally, students frequently assume that the primary goal of finding a source is to confirm that what they plan to say is not new — that it is backed up by (or at least thought by) other people in the world. These assumed goals often do not match the professor’s goals22 of having students engage with literature in order to generate novel interpretations rather than simply report on what is already known.
The Difficulties of Evaluating and Selecting Sources
While these findings and experiences are sobering, they are not surprising. Not only are sophomore students only halfway through their education, but source evaluation is also a nuanced, situation-dependent process, and the amount of information available to sort through is increasingly vast and entangled. At the same time, it is becoming more difficult to distinguish between the various types of sources, especially online. Every type of online source looks like a “website” or a PDF even when it may actually be anything from a blog post to a book review to a peer-reviewed article to a full monograph. This phenomenon has been described as “container collapse.”23 Coupled with our students’ reduced high school experience with research and with libraries,24 container collapse leaves students increasingly confused by source evaluation.
The problem does not lie only in students’ limited experience with physical resources, or in the fact that many sources now have no physical counterpart. Multiple studies have shown that online sources are difficult to classify in general. In fact, two separate studies in 2016 found that student level, age, and experience made no difference in participants’ ability to identify online source types.25 Instruction may improve performance,26 but the major finding is that online publications are difficult for people to classify correctly, even into broad categories like “academic journal” or “book review.”
It may seem like a relic of a past era to think about publication types as an important aspect of source evaluation,27 but distinguishing between source genres is fundamental to the evaluation process. Source genres “identify a repertoire of actions that may be taken in a set of circumstances,” and they “identify the possible intentions” of their authors.28 Novice writers and researchers are therefore doubly hampered, first by not knowing which source genres are appropriate for various rhetorical tasks, and then by not being able to identify which genre of source they see in their browser windows.
Of course, evaluation doesn’t stop once appropriate source types are in hand. Students then have to navigate disciplinary conventions, subtle “credibility cues,”29 and webs of constructed authority, all of which is in addition to the basics of finding sources that speak about their topics in ways that seem informative, relevant, and understandable.30 For such a daunting set of tasks, all within tight term or semester time constraints, it’s no wonder that some students falter.
Supporting Student Development in Selecting Sources
For reference and instruction librarians supporting undergraduates, a foundational part of our work has always involved helping students develop the knowledge and skills needed for good source evaluation. Our experiences and ILSW findings emphasize that this core work of librarianship is vitally important. While librarians may not be as knowledgeable about specialized disciplinary discourse as disciplinary faculty, we are uniquely positioned to help students recognize and navigate disciplinary conventions,31 and there is also evidence that library instruction can improve students’ ability to recognize online source types.32
Librarians help novice researchers develop their understanding of how to recognize source types through curated lists such as bibliographies, handbooks, research guides, and other resources that are created by experts rather than by algorithms. In an instruction session or research consultation, it takes very little time to show students that, in general, bibliography entries that list a place and/or publisher are books or chapters in books, while the other entries are in some other kind of publication (journal, website, etc.). Librarians can then point out that bibliographies are more than just alphabetical lists of relatively random works cited in a text — that they are instead maps of scholarly conversations, gathering together (ideally) the most relevant and important sources related to the text at hand. Students can then be encouraged to take notes on the keywords in bibliography entry titles, the journals that publish works related to the topic, prominent authors, key publishers, publication date distributions, and the like to develop a more nuanced sense of the kinds of sources that could be related to their topics. Each entry in a bibliography is a potential source in itself, but it also points to pockets of related sources for students to explore.
On our campus, librarians find that they can provide some very practical but crucial advice for undergraduates by introducing and explaining less-understood source types and also by helping students develop research strategies that use the various source genres to their full advantage. For example, one of the first things some liaison librarians talk about in research appointments is the importance of using scholarly reference sources and even Wikipedia to build context and gain a foothold in a new research area.33 It can seem inefficient to spend time reading a source that won’t be acceptable in a bibliography, since scholarly convention often discourages citing reference sources in academic papers. Because of this, we often see students skip this step entirely and dive right into an argument without much knowledge about the subject they are trying to discuss. Whether licensed or freely available, reference sources provide important factual contexts, define core vocabulary, and point to major voices in the conversation around the topic at hand. Reference sources can also signal what kinds of other sources count as good evidence in this conversation, and where to find them.
Crucially, going through the step of seeking out background sources, as Joseph Bizup terms them,34 will result in a better understanding of the topic at hand, which allows students to ask increasingly complex questions of their topic and make better use of analytical sources. Reading, rather than being a process that happens separately and after finding and accessing information, is an integral part of both “rhetorical invention”35 and also information literacy. Searching and even browsing may feel more active and efficient to the novice researcher, but good source discovery, evaluation, and selection all require active reading. And active reading in turn requires knowing how to spot and interpret the moves that authors make when positioning themselves against the backdrop of prior information, the language of the field (which will be useful for future searches), and the credibility cues that authors use when introducing outside sources into their writing. Building this context is one of the most crucial early steps in the research process.
Organizing information sources is also critical to source selection. For example, at Carleton we often introduce students to bibliographic managers such as Zotero or EndNote not only as citation generators, but as tools that help researchers think critically about their sources. We emphasize the practical aspects of these systems, but we also use them to teach students about the importance of citation tracking, tagging, and sorting, and how these practices allow researchers to see how sources are related to each other. We talk about the importance of organizing their own research and using a citation management system to identify prominent scholars, figure out which authors or experts are missing or left out, and even select source types if that is a requirement of a particular assignment.
Using Sources in Research and Writing
In our ILSW study, the Use of Evidence category on the rubric measures how well students synthesize, contextualize, and incorporate evidence into their writing. In 2018, this category of information literacy skill gave sophomore students almost as much trouble as the Evaluation of Sources category, with only 8% of papers given a “Strong” score of 4, and 12% given a “Poor” score of 1 (see Figure 3). As with source evaluation, we expect Carleton sophomores to find these skills challenging, and our study’s findings reinforced these expectations. In addition, 29% of papers received the designation “Sources not integrated or synthesized,” usually indicating that students ceded control of their arguments to excessive quotation, summary, or reporting rather than calling on sources as rhetorical tools that advance the paper’s goals. Readers noted in the free-text comments such patterns as “Appears to cherry pick from those sources, most of which probably would have been great sources if used better,” or “There are a lot of opinions without much substantiation.” These weaker papers revealed confusion about the reason for drawing on evidence in the first place — not seeing the importance of interplay between source material and the student’s own thoughts.36 Stronger papers, on the other hand, integrated evidence in service of the students’ rhetorical goals, and the students framed and contextualized this evidence such that it helped the reader understand and trust the paper’s claims.
Students were often successful when attributing evidence in their written work, generally providing information that helped their readers understand the origins of the evidence and ideas they incorporated. Only 14% of papers exhibited “Egregious errors in bibliography, in-text citations, or notes,” and this rubric category received the second highest number of 3s and 4s after Strategic Inquiry. On the other hand, “Under-cited/supported claims” was the most common pattern noted among papers, appearing on the scoring sheets 51% of the time, and nearly 48% of the optional free text comments submitted by readers pointed out misunderstandings about attributive practices. This suggests that students often attribute uncritically, not realizing that attribution is a set of rhetorical practices within academic communities rather than simply a set of mechanics that stave off plagiarism charges. In consultations and classes, we see similar confusion about when citations are expected and how they function, with many students thinking they should appear only after direct quotations or close paraphrases rather than understanding that citations also act as authority cues and as portals into further reading for future researchers.
Supporting Student Development in Using Sources
As with the Use of Evidence category above, the weaker papers in the attribution category signaled confusion about the underlying purpose of bringing evidence and outside sources into papers. This points to a need for librarians, professors, and writing center professionals to explicitly discuss with students the rhetorical function of citation, rather than simply the mechanics of quoting, paraphrasing, and formatting citations.
Novices in academic writing often benefit from explicit instruction in the ways that academic writing draws on communication conventions they already know but may not have recognized in the unfamiliar genres they’re reading and writing. Especially with our first- and second-year students, we give examples of the types of conversations they might have with a friend and point out that it would be awkward if one conversation partner simply repeated everything the other person said. Instead, in conversation each person builds on what the other has said to generate new meaning or knowledge. For our upper-level students, we teach that the point of research is not to create a collection of statements as proof that they have read broadly, but rather to focus on finding connections, selecting key sources, and remaining flexible in their thinking so that they can stay responsive to what they’re learning. Once a student gains a reasonably clear understanding of their topic, we encourage them to identify themes emerging in the reading that prompt new questions, and we teach them to look for small clues in their readings that indicate points where experts approach the topic differently, build on each other, and push against each other, in doing so making space for new contributions to human knowledge. This in turn helps students see that they can create space for themselves in the scholarly conversation — that they can join the conversation by engaging with their sources rather than simply reporting on them.
Sometimes, a student’s ability to concentrate on finding good sources is complicated by the restrictions of their class assignments. We often see assignments that require students to find an exact number of different source types, such as two peer-reviewed journal articles, a book, and a news source. This gives librarians an opportunity to teach students about these types of sources: where to find them, how to recognize them, and how they function in scholarly communication. However, it does not always help students fully develop their own arguments or ask more complex questions about their topics, because they are consumed with checking all the boxes of the assignment requirements. Sometimes these requirements and constraints can help steer students toward topics or approaches with well-matched sources available, but other times the writing prompt and its source requirements can be at odds with each other. In either case, students can learn to navigate the challenges of their assignments if they understand that not every topic can be fully explored through peer-reviewed academic journals alone. Part of what they’re learning to do is scope their topics more appropriately, whether more broadly or narrowly, and to work within (or push productively against) disciplinary conventions about appropriate source types.
As students grapple with the difficulties of entering a community of academic practice, another challenge they face has to do with attribution practices within the various disciplines. Carleton students report worrying about accidentally plagiarizing, but they lack nuanced understandings of citation norms within each discipline and sub-discipline.37 This, combined with not knowing that there is flexibility within citation styles to make citing decisions based on overall best practices, is a major stumbling block. Students often see the act of citing as boring and mechanical, a box to check. In fact, as Robert Connors has noted, “citations have an essentially rhetorical nature” that contain a “universe of meanings” and that are the “products and reflections of social and rhetorical realities.”38 Citations function within scholarly conversation to help readers evaluate the claims at hand, help authors position themselves within the field, and point readers to related conversations in the literature.39 Through our conversations with students in research consultations, we have found that students often mistake the various citation formats for arbitrary sets of rules, not understanding that each style reflects a discourse community’s communication priorities and strategies.
In our experience, shifting the conversation toward these underlying goals of attribution and away from punitive and mechanistic tutorials helps students both make better choices in their citation practices and participate more fully in their scholarly community. Each year we conduct training sessions with peer tutors in the Writing Center during which we discuss these concepts, and each year those peer tutors report that this was one of the most useful and eye-opening topics discussed during our training sessions. Similarly, two quick questions have helped students in research consultations decide whether something they’ve written is “common knowledge” and therefore doesn’t need a citation: “Might my reader not automatically agree with this? Might my reader be curious to know more about this?” If the answer to either question is “yes” then a citation is useful. Countless instruction sessions and research consultations over the years have dealt with similar themes, and students report similar feelings of empowerment (and sometimes even excitement) about attribution practices once they understand the many ways that citations can function in rhetoric.
Our Information Literacy in Student Writing project and the scholarship in librarianship, information literacy, and rhetoric have shown us that there are many opportunities to continue working with faculty, academic support staff, and students on source selection and use. We also know from our research and first-hand experience that these practices are difficult and require just that: practice. Library instruction can help over time, and one of the advantages of having faculty score papers for our ILSW project with us is that they see concrete evidence that evaluation and source selection are challenges for students. Results from our ILSW findings have opened up further opportunities with a number of faculty to provide more information literacy instruction and consultation to their students.40 We think it is also important to acknowledge that the context in which our students find the majority of their sources, the internet (including databases, online catalogs, Google Scholar, etc.), is only going to make evaluation more complicated. Information that is born digital does not fit into the neat containers we could once hold in our hands and more easily identify, nor does all information correspond to a physical format anymore. For these reasons, librarians, faculty, and other academic support staff should discuss source evaluation and selection with students and equip them with concrete strategies.
This paper was only possible because of a whole community of people. We can’t possibly name everyone who has shaped our work and the ILSW project, but we would like to particularly acknowledge the contributions of:
- All the students who generously made their Sophomore Portfolios available for research
- Members of Gould Library’s Reference & Instruction Department, for conceiving of this project and carrying it through from year to year, especially Matt Bailey, Sarah Calhoun, Audrey Gunn, Susan Hoang, Sean Leahy, Danya Leebaw, Kristin Partlo, Charlie Priore, Carolyn Sanford, Heather Tompkins, and Ann Zawistoski
- Carleton’s Gould Library, especially College Librarian Brad Schaffner, for supporting this project
- Carleton’s Dean of the College office, particularly Bev Nagel, George Shuffelton, and Danette DeMann for approving, supporting, and funding the 2018 ILSW study
- Carol Trosset, data consultant extraordinaire
- Carleton’s Perlman Center for Teaching and Learning, especially Melissa Eblen-Zayas, for invaluable support and advice
- Carleton’s Writing Across the Curriculum program, especially the director, George Cusack, and Mary Drew, for access to the Sophomore Writing Portfolio papers and for so many other radical acts of collaboration
- Carleton’s Office of Institutional Research and Assessment, especially Jody Friedow and Bill Altermatt, for access to sophomore student demographic reports
- All the faculty and staff who participated in the ILSW project’s reading days
- The reviewers who read and provided feedback on drafts of this paper. Thank you for your time and insights Ian Beilin and Amy Mars.
ACRL. “Framework for Information Literacy for Higher Education.” Chicago: Association of College and Research Libraries, 2016. https://doi.org/10.1080/00049670.1995.10755718.
Bakhtin, M. M. Speech Genres and Other Late Essays. Edited by Michael Holquist and Caryl Emerson. Translated by Vern W. McGee. 1st edition. University of Texas Press Slavic Series 8. Austin: University of Texas Press, 1986.
Bazerman, Charles. “Systems of Genres and the Enactment of Social Intentions.” In Genre and the New Rhetoric, edited by Aviva Freedman and Peter Medway, 69–85. London: Taylor & Francis, 2005.
Bizup, Joseph. “BEAM: A Rhetorical Vocabulary for Teaching Research-Based Writing.” Rhetoric Review 27, no. 1 (January 4, 2008): 72–86. https://doi.org/10.1080/07350190701738858.
Breakstone, Joel, Sarah McGrew, Mark Smith, Teresa Ortega, and Sam Wineburg. “Why We Need a New Approach to Teaching Digital Literacy.” Phi Delta Kappan 99, no. 6 (2018): 27–32. https://doi.org/10.1177/0031721718762419.
Brent, Doug. Reading as Rhetorical Invention: Knowledge, Persuasion, and the Teaching of Research-Based Writing. Urbana, Ill.: National Council of Teachers of English, 1992.
Buhler, Amy, and Tara Cataldo. “Identifying E-Resources: An Exploratory Study of University Students.” Library Resources & Technical Services 60, no. 1 (January 7, 2016): 23–37. https://doi.org/10.5860/lrts.60n1.23.
Buhler, Amy G., Ixchel M. Faniel, Brittany Brannon, Christopher Cyr, Tara Tobin, Lynn Silipigni Connaway, Joyce Kasman Valenza, et al. “Container Collapse and the Information Remix: Students’ Evaluations of Scientific Research Recast in Scholarly vs. Popular Sources.” In ACRL Proceedings, 14. Cleveland, Ohio, 2019.
Bull, Alaina C., and Alison Head. “Dismantling the Evaluation Framework – In the Library with the Lead Pipe,” July 21, 2021. https://www.inthelibrarywiththeleadpipe.org/2021/dismantling-evaluation/.
Calhoun, Cate. “Using Wikipedia in Information Literacy Instruction: Tips for Developing Research Skills.” College & Research Libraries News 75, no. 1 (2014): 32–33. https://doi.org/10.5860/crln.75.1.9056.
Connaway, Lynn Silipigni. “What Is ‘Container Collapse’ and Why Should Librarians and Teachers Care? – OCLC Next.” Next (blog), June 20, 2018. http://www.oclc.org/blog/main/what-is-container-collapse-and-why-should-librarians-and-teachers-care/.
Connors, Robert J. “The Rhetoric of Citation Systems, Part I: The Development of Annotation Structures from the Renaissance to 1900.” Rhetoric Review 17, no. 1 (1998): 6–48. https://doi.org/10.1080/07350199809359230.
Cusack, George. “Writing Across the Curriculum.” Carleton College, 2018. https://apps.carleton.edu/campus/writingprogram/.
Daniels, Erin. “Using a Targeted Rubric to Deepen Direct Assessment of College Students’ Abilities to Evaluate the Credibility of Sources.” College & Undergraduate Libraries 17, no. 1 (2010): 31. https://doi.org/10.1080/10691310903584767.
Gullifer, Judith, and Graham A. Tyson. “Exploring University Students’ Perceptions of Plagiarism: A Focus Group Study.” Studies in Higher Education 35, no. 4 (June 1, 2010): 463–81. https://doi.org/10.1080/03075070903096508.
Hofer, Barbara K. “Personal Epistemology as a Psychological and Educational Construct: An Introduction.” In Personal Epistemology: The Psychology of Beliefs about Knowledge and Knowing, 3–14. London: Routledge, 2004.
Hyland, Ken. “Academic Attribution: Citation and the Construction of Disciplinary Knowledge.” Applied Linguistics 20, no. 3 (1999): 341–67.
Jastram, Iris, Danya Leebaw, and Heather Tompkins. “CSI(L) Carleton: Forensic Librarians and Reflective Practices.” In the Library with the Lead Pipe, 2011. https://www.inthelibrarywiththeleadpipe.org/2011/csil-carleton-forensic-librarians-and-reflective-practices/.
———. “Situating Information Literacy Within the Curriculum: Using a Rubric to Shape a Program.” Portal: Libraries and the Academy 14, no. 2 (2014): 165–86. https://doi.org/10.1353/pla.2014.0011.
Leebaw, Danya, Kristin Partlo, and Heather Tompkins. “‘How Is This Different from Critical Thinking?’: The Risks and Rewards of Deepening Faculty Involvement in an Information Literacy Rubric,” 270–80. Indianapolis: ACRL 2013, 2013. http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2013/papers/LeebawPartloTompkins_HowIsThis.pdf.
Leeder, Chris. “Student Misidentification of Online Genres.” Library & Information Science Research 38, no. 2 (April 2016): 125–32. https://doi.org/10.1016/j.lisr.2016.04.003.
Lloyd, Annemaree. Information Literacy Landscapes: Information Literacy in Education, Workplace, and Everyday Contexts. Oxford: Chandos Publishing, 2010.
McGeough, Ryan, and C. Kyle Rudick. “‘It Was at the Library; Therefore It Must Be Credible’: Mapping Patterns of Undergraduate Heuristic Decision-Making.” Communication Education 67, no. 2 (April 3, 2018): 165–84. https://doi.org/10.1080/03634523.2017.1409899.
Meriam Library. “Is This Source or Information Good?,” 2010. https://library.csuchico.edu/help/source-or-information-good.
Miller, Carolyn R. “Genre as Social Action.” Quarterly Journal of Speech 70, no. 2 (May 1, 1984): 151–67. https://doi.org/10.1080/00335638409383686.
Gould Library Reference & Instruction Department. “Research Practices Survey 2015-16.” Northfield, MN: Gould Library, Carleton College, 2017. https://digitalcommons.carleton.edu/libr_staff_faculty/16/.
Russo, Alyssa, Amy Jankowski, Stephanie Beene, and Lori Townsend. “Strategic Source Evaluation: Addressing the Container Conundrum.” Reference Services Review 47, no. 3 (August 1, 2019): 294–313. https://doi.org/10.1108/RSR-04-2019-0024.
Simmons, Michelle Holschuh. “Librarians as Disciplinary Discourse Mediators: Using Genre Theory to Move Toward Critical Information Literacy.” Portal: Libraries and the Academy 5, no. 3 (2005): 297–311. http://muse.jhu.edu/journals/portal_libraries_and_the_academy/v005/5.3simmons.html.
Soules, Aline. “E-Books and User Assumptions.” Serials: The Journal for the Serials Community 22, no. 3 (January 1, 2009): S1–5. https://doi.org/10.1629/22S1.
White, Beth A., Taimi Olsen, and David Schumann. “A Threshold Concept Framework for Use across Disciplines.” In Threshold Concepts in Practice, edited by Ray Land, Jan H. F. Meyer, and Michael T. Flanagan, 53–63. Educational Futures. Rotterdam: SensePublishers, 2016. https://doi.org/10.1007/978-94-6300-512-8_5.
Appendix: ILSW Rubric
- The ILSW project uses a scoring rubric (see appendix) that is designed for use across disciplines, and it is intended to be flexible across many paper genres. It does not reveal specifics about student research strategies, but it does allow us to identify characteristics of information literacy habits of mind as they appear in completed student writing. The rubric calls attention to the clues students give their readers about how the students conceive of their research strategies and how they marshal and deploy evidence in service of their rhetorical goals. Our full rubric, scoring sheet, and coder’s manual are available at https://go.carleton.edu/ilsw [↩]
- Iris Jastram, Danya Leebaw, and Heather Tompkins, “Situating Information Literacy Within the Curriculum: Using a Rubric to Shape a Program,” Portal: Libraries and the Academy 14, no. 2 (2014): 165–86, https://doi.org/10.1353/pla.2014.0011; Danya Leebaw, Kristin Partlo, and Heather Tompkins, “‘How Is This Different from Critical Thinking?’: The Risks and Rewards of Deepening Faculty Involvement in an Information Literacy Rubric” (Indianapolis: ACRL 2013, 2013), 270–80, http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2013/papers/LeebawPartloTompkins_HowIsThis.pdf. [↩]
- Information about the Sophomore Writing Portfolio can be found at https://www.carleton.edu/writing/portfolio/. [↩]
- More about Carleton College can be found in this profile: https://web.archive.org/web/20210802183721/https://www.carleton.edu/about/carleton-at-a-glance/ [↩]
- George Cusack, “Writing Across the Curriculum,” Carleton College, 2018, https://apps.carleton.edu/campus/writingprogram/. [↩]
- Twenty-five percent of the papers were read twice to provide inter-rater reliability scores. Among the group of readers, librarians had relatively high levels of inter-rater reliability (73% agreement). Faculty disagreed on scores more frequently (54% agreement), possibly due to having less experience with the project, or possibly because they are accustomed to grading papers according to how well they meet the requirements of their assignments rather than evaluating according to our rubric, which is not tied to a particular assignment. However, even with these differences, clear trends emerged from the comments and from the statistically significant differences in the data. [↩]
- We did not include any measurement of whether students in this sample had had any library instruction or experience. [↩]
- See Annemaree Lloyd, Information Literacy Landscapes: Information Literacy in Education, Workplace, and Everyday Contexts (Oxford: Chandos Publishing, 2010). [↩]
- ACRL, “Framework for Information Literacy for Higher Education” (Chicago: Association of College and Research Libraries, 2016), https://doi.org/10.1080/00049670.1995.10755718. [↩]
- M. M. Bakhtin, Speech Genres and Other Late Essays, ed. Michael Holquist and Caryl Emerson, trans. Vern W. McGee, 1st edition., University of Texas Press Slavic Series 8 (Austin: University of Texas Press, 1986), 41. [↩]
- Carolyn R. Miller, “Genre as Social Action,” Quarterly Journal of Speech 70, no. 2 (May 1, 1984): 163, https://doi.org/10.1080/00335638409383686. [↩]
- See Bakhtin, Speech Genres and Other Late Essays; Miller, “Genre as Social Action.” [↩]
- Barbara K. Hofer, “Personal Epistemology as a Psychological and Educational Construct: An Introduction,” in Personal Epistemology: The Psychology of Beliefs about Knowledge and Knowing (London: Routledge, 2004), 3. [↩]
- Jan Meyer and Ray Land, “Threshold Concepts and Troublesome Knowledge: Linkages to Ways of Thinking and Practicing Within the Disciplines” (Enhancing Teaching-Learning Environments in Undergraduate Courses, Occasional Report 4; Universities of Edinburgh, Coventry, and Durham; May 2003), http://www.etl.tla.ed.ac.uk/docs/ETLreport4.pdf. [↩]
- Beth A. White, Taimi Olsen, and David Schumann, “A Threshold Concept Framework for Use across Disciplines,” in Threshold Concepts in Practice, ed. Ray Land, Jan H. F. Meyer, and Michael T. Flanagan, Educational Futures (Rotterdam: SensePublishers, 2016), 53, https://doi.org/10.1007/978-94-6300-512-8_5. [↩]
- ACRL, “Framework for Information Literacy for Higher Education.” [↩]
- Meriam Library, “Is This Source or Information Good?,” 2010, https://library.csuchico.edu/help/source-or-information-good. [↩]
- Ryan McGeough and C. Kyle Rudick, “‘It Was at the Library; Therefore It Must Be Credible’: Mapping Patterns of Undergraduate Heuristic Decision-Making,” Communication Education 67, no. 2 (April 3, 2018): 165–84, https://doi.org/10.1080/03634523.2017.1409899. [↩]
- ACRL, “Framework for Information Literacy for Higher Education.” [↩]
- Joel Breakstone et al., “Why We Need a New Approach to Teaching Digital Literacy,” Phi Delta Kappan 99, no. 6 (2018): 27–32, https://doi.org/10.1177/0031721718762419; Alaina C. Bull and Alison Head, “Dismantling the Evaluation Framework – In the Library with the Lead Pipe,” July 21, 2021, https://www.inthelibrarywiththeleadpipe.org/2021/dismantling-evaluation/. [↩]
- Private internal report on the results of the Research Practices Survey, administered by HEDS (the Higher Education Data Sharing Consortium), prepared by Carol Trosset in September 2015. [↩]
- Gleaned from conversations with our liaison faculty and from conversations during internal professional development workshops, often led by the Learning and Teaching Center. [↩]
- Amy G. Buhler et al., “Container Collapse and the Information Remix: Students’ Evaluations of Scientific Research Recast in Scholarly vs. Popular Sources,” in ACRL Proceedings (Association of College and Research Libraries, Cleveland, Ohio, 2019), 14; Lynn Silipigni Connaway, “What Is ‘Container Collapse’ and Why Should Librarians and Teachers Care? – OCLC Next,” Next (blog), June 20, 2018, http://www.oclc.org/blog/main/what-is-container-collapse-and-why-should-librarians-and-teachers-care/. [↩]
- We know this from the Research Practices Survey, conducted at Carleton in 2006 and again in 2015, which contains many measures of these changes in pre-college experience. While we are a highly selective institution, our students’ pre-college experiences with research have been diminishing over time. Some of the RPS results are available in “Research Practices Survey 2015-16” (Northfield, MN: Gould Library, Carleton College, 2017), https://digitalcommons.carleton.edu/libr_staff_faculty/16/. [↩]
- Amy Buhler and Tara Cataldo, “Identifying E-Resources: An Exploratory Study of University Students,” Library Resources & Technical Services 60, no. 1 (January 7, 2016): 33, https://doi.org/10.5860/lrts.60n1.23; Chris Leeder, “Student Misidentification of Online Genres,” Library & Information Science Research 38, no. 2 (April 2016): 129, https://doi.org/10.1016/j.lisr.2016.04.003. [↩]
- Leeder, “Student Misidentification of Online Genres,” 129. [↩]
- Aline Soules, “E-Books and User Assumptions,” Serials: The Journal for the Serials Community 22, no. 3 (January 1, 2009): S4, https://doi.org/10.1629/22S1. [↩]
- Charles Bazerman, “Systems of Genres and the Enactment of Social Intentions,” in Genre and the New Rhetoric, ed. Aviva Freedman and Peter Medway (London: Taylor & Francis, 2005), 69. [↩]
- Erin Daniels, “Using a Targeted Rubric to Deepen Direct Assessment of College Students’ Abilities to Evaluate the Credibility of Sources,” College & Undergraduate Libraries 17, no. 1 (2010): 35, https://doi.org/10.1080/10691310903584767. [↩]
- Alyssa Russo et al., “Strategic Source Evaluation: Addressing the Container Conundrum,” Reference Services Review 47, no. 3 (August 1, 2019): 294–313, https://doi.org/10.1108/RSR-04-2019-0024. [↩]
- Michelle Holschuh Simmons, “Librarians as Disciplinary Discourse Mediators: Using Genre Theory to Move Toward Critical Information Literacy,” Portal: Libraries and the Academy 5, no. 3 (2005): 297–311, http://muse.jhu.edu/journals/portal_libraries_and_the_academy/v005/5.3simmons.html. [↩]
- Leeder, “Student Misidentification of Online Genres,” 129. [↩]
- Cate Calhoun, “Using Wikipedia in Information Literacy Instruction: Tips for Developing Research Skills,” College & Research Libraries News 75, no. 1 (2014): 32–33, https://doi.org/10.5860/crln.75.1.9056. [↩]
- Joseph Bizup, “BEAM: A Rhetorical Vocabulary for Teaching Research-Based Writing,” Rhetoric Review 27, no. 1 (January 4, 2008): 75, https://doi.org/10.1080/07350190701738858. [↩]
- Doug Brent, Reading as Rhetorical Invention: Knowledge, Persuasion, and the Teaching of Research-Based Writing (Urbana, Ill.: National Council of Teachers of English, 1992). [↩]
- Anecdotally, when the reference and instruction librarians met with Writing Center workers in fall 2019 to discuss how we can better work together, one point of common ground was the desire to support students who struggle to get their own voice into their writing rather than relying on sources. [↩]
- Judith Gullifer and Graham A. Tyson, “Exploring University Students’ Perceptions of Plagiarism: A Focus Group Study,” Studies in Higher Education 35, no. 4 (June 1, 2010): 463–81, https://doi.org/10.1080/03075070903096508. [↩]
- Robert J. Connors, “The Rhetoric of Citation Systems, Part I: The Development of Annotation Structures from the Renaissance to 1900,” Rhetoric Review 17, no. 1 (1998): 6–7, https://doi.org/10.1080/07350199809359230. [↩]
- See Ken Hyland, “Academic Attribution: Citation and the Construction of Disciplinary Knowledge,” Applied Linguistics 20, no. 3 (1999): 342–44. [↩]
- Iris Jastram, Danya Leebaw, and Heather Tompkins, “CSI(L) Carleton: Forensic Librarians and Reflective Practices,” In the Library with the Lead Pipe, 2011, https://www.inthelibrarywiththeleadpipe.org/2011/csil-carleton-forensic-librarians-and-reflective-practices/. [↩]