Using a Proposed Library Guide Assessment Standards Rubric and a Peer Review Process to Pedagogically Improve Library Guides: A Case Study

In Brief

Library guides help librarians provide patrons with information about library resources, services, and tools. Despite their perceived usefulness, there is little discussion of designing library guides pedagogically by following a set of assessment standards in a quality-checked review. Instructional designers regularly use vetted assessment standards and a peer review process to build high-quality courses, yet librarians typically do not when designing library guides. This article explores using a set of standards remixed from SUNY’s Online Course Quality Review Rubric (OSCQR) together with a peer review process. The authors used a case study approach to test the effectiveness of building library guides with the proposed standards by asking college students to assess two Fake News guides (one revised to meet the proposed standards). Results indicated that most students preferred the revised library guide to the original for personal use. The majority valued the revised guide for integration into a learning management system and perceived it as more beneficial for professors to teach from. Future studies should replicate this study and include faculty perspectives on the pedagogical value of a library guide designed with the proposed rubric.

A smiling librarian assists a student who is sitting at a computer located within the library.

Image: “Helpful”. Digital image created with Midjourney AI. By Trina McCowan CC BY-NC-SA 4.0


Library guides, or LibGuides, are a proprietary publishing tool for libraries and museums created by the company Springshare; librarians can use LibGuides to publish on a variety of topics centered around research (Dotson, 2021; Springshare, n.d.). For consistency, the authors will use the term library guides moving forward. Librarians can use Springshare’s tool to publish web pages that educate users on library subjects, topics, procedures, or processes (Coombs, 2015). Additionally, librarians can work with teaching faculty to create course guides that compile resources for specific classes (Berić-Stojšić & Dubicki, 2016; Clever, 2020). According to Springshare (n.d.), library guides are widely used by academic, museum, school, and public libraries; approximately 130,000 libraries worldwide use this tool. The popularity and continued use of library guides may stem from their ease of use, as they eliminate the need to know a coding language to develop online content (Bergstrom-Lynch, 2019).

Baker (2014) described library guides as the “evolutionary descendants of library pathfinders” (p. 108). The first pathfinders were paper brochures that provided suggested resources for advanced research. Often, librarians created these tools for the advanced practitioner as patrons granted access to the library were researchers and seasoned scholars. As the end users were already experts, there was little need for librarians to provide instruction for using the resources (Emanuel, 2013). Later, programs such as MIT’s 1970s Project Intrex developed pathfinders that presented students with library resources in their fields of interest (Conrad & Stevens, 2019). As technology advanced, librarians created and curated pathfinders for online access (Emanuel, 2013). 

Today, due to the modernization of pathfinders as library guides and their ease of discoverability, students and unaffiliated online users often find these guides without the assistance of a librarian (Emanuel, 2013). Search engines such as Google can extend a library guide’s reach far beyond a single institutional website, drawing the attention of information experts and novice internet users alike (Brewer et al., 2017; Emanuel, 2013; Lauseng et al., 2021). This expanded access means a librarian will not always be present to help interpret and explain the library guide’s learning objectives. Stone et al. (2018) state that library guides should be built using pedagogical principles “where the guide walks the student through the research process” (p. 280). Bergstrom-Lynch (2019) argues that user-centered library design studies have received abundant attention while learning-centered design has received little, and advocates for more attention to learning-centered design principles as library guides are integrated into Learning Management Systems (LMS) such as Canvas and Blackboard (Berić-Stojšić & Dubicki, 2016; Bielat et al., 2013; Lauseng et al., 2021) and can be presented as a learning module for the library (Emanuel, 2013; Mann et al., 2013). The use of library guides as online learning and teaching tools is not novel; however, their creation and evaluation using instructional design principles are a recent development (Bergstrom-Lynch, 2019).

A core component of an instructional designer’s job is to ensure that online course development meets the institution’s standards for quality assurance (Halupa, 2019). Instructional designers can help write appropriate course and learning objectives and select learning activities and assessments that align with the module’s objectives. They can also provide feedback on designing a student-friendly course, being mindful of cognitive overload, course layout, font options, and color selection. Finally, instructional designers are trained in designing learning content that meets accessibility standards (Halupa, 2019).

Instructional design teams and teaching faculty can choose from a variety of quality assurance rubrics to ensure that key elements for online learning are present in the online course environment. Examples of quality assurance tools include the Quality Matters (QM) Higher Education Rubric and SUNY’s Online Course Quality Review Rubric (OSCQR), a professional development course-refresh process built around a rubric (Kathuria & Becker, 2021; OSCQR-SUNY, n.d.). QM is a not-for-profit subscription service that provides education on assessing online courses through the organization’s assessment rubric of general and specific standards (Unal & Unal, 2016). The assessment process is a “collegial, faculty-driven, research-based peer review process…” (Unal & Unal, 2016, p. 464). For a national review, QM suggests that three QM-certified and trained reviewers conduct the quality review, including a content specialist and one external reviewer from outside the university (Pickens & Witte, 2015). Some universities, such as the University of North Florida, submit online courses for a QM certificate with High-Quality recognition or conduct an in-house review based on the standards to earn a High-Quality designation. For an in-house review at UNF, a subject matter expert, instructional designer, and trained faculty reviewer assess the course and provide feedback based on the standards (CIRT, “Online Course Design Quality Review”, n.d.; Hulen, 2022). Instructional designers at some institutions may use other pedagogical rubrics that are freely available and not proprietary. OSCQR is an openly licensed online course review rubric that allows use and/or adaptation and can serve as a professional development exercise when building and/or refreshing online courses (OSCQR-SUNY, n.d.).

Typically, library guides do not receive the kind of vetted, rigorous pedagogical peer review that online courses do. Because library guides are widely accessible and are used as teaching tools, they should be crafted for a diverse audience and be easy for first-time users to understand and navigate (Bergstrom-Lynch, 2019; Smith et al., 2023). However, Conrad & Stevens (2019) state: “Inexperienced content creators can inadvertently develop guides that are difficult to use, lacking consistent templates and containing overwhelming amounts of information” (p. 49). Lee et al. (2021) reviewed library guides about the systematic review process and, despite the complexity of that topic, noted a lack of instruction about the process itself. If instructional opportunities are missing from even the most complex topics, all types of library guides may need review with fresh eyes.

Moukhliss aims to describe a set of quality review standards, the Library Guide Assessment Standards (LGAS) rubric with annotations that she created based on the nature of library guides, and by remixing the SUNY-OSCQR rubric. Two trained reviewers are recommended to work with their peer review coordinator to individually engage in the review process and then convene to discuss the results. A standard will be marked Met when both of the reviewers mark it as Met, noting the evidence to support the Met designation. In order for a standard to be marked as Met, the library guide author should show evidence of 85% accuracy or higher per standard. To pass the quality-checked review to receive a quality-checked badge, the peer review team should note that 85% of the standards are marked as “Met.” If the review fails, the library guide author may continue to edit the guide or publish the guide without the quality-checked badge. Details regarding the peer review process are shared in the Library Guide Assessment Standards for Quality-Checked Review library guide. Select the Peer Review Training Materials tab for the training workbook and tutorial.
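The review logic described above (a standard is Met only when both reviewers agree, and the guide earns the quality-checked badge when at least 85% of the standards are Met) can be sketched as a short script. This is a hypothetical illustration only; the actual review is conducted manually by trained reviewers using the LGAS workbook, and the function names are invented for this sketch.

```python
# Hypothetical sketch of the LGAS pass/fail logic: a standard counts
# as Met only when BOTH reviewers mark it Met, and the guide earns
# the quality-checked badge when at least 85% of standards are Met.

def standard_met(reviewer_a: bool, reviewer_b: bool) -> bool:
    """A standard is Met only if both reviewers agree it is Met."""
    return reviewer_a and reviewer_b

def passes_review(reviews: list[tuple[bool, bool]]) -> bool:
    """Return True when at least 85% of standards are marked Met."""
    met = sum(standard_met(a, b) for a, b in reviews)
    return met / len(reviews) >= 0.85

# Example: 32 standards, 28 marked Met by both reviewers (87.5%)
reviews = [(True, True)] * 28 + [(True, False)] * 4
print(passes_review(reviews))  # True: 28/32 = 87.5% >= 85%
```

If the threshold is not reached, the guide author may keep editing and resubmit, or publish without the badge, as described above.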

Situational Context

The University of North Florida (UNF) Thomas G. Carpenter Library services an R2 university of approximately 16,500 students. The UNF Center for Instruction and Research Technology (CIRT) supports two online learning librarians. The online librarians’ roles are to provide online instruction services to UNF faculty. CIRT staff advocate for online teaching faculty to submit their online courses to undergo a rigorous quality review process. Faculty can obtain a High-Quality designation for course design by working with an instructional designer and an appointed peer reviewer from UNF, or they may opt to aim for a High-Quality review after three years of course implementation by submitting for national review with Quality Matters (Hulen, 2022). Currently, Moukhliss serves as a peer reviewer for online High-Quality course reviews. 

After several High-Quality course reviews, Moukhliss questioned why no vetted review standards, applied by trained librarians, exist for the various types of library guides as they do for online courses. She therefore borrowed from the SUNY Online Course Quality Review Rubric (OSCQR) and remixed it as the Library Guide Assessment Standards Rubric with annotations.

Literature Review

The amount of peer-reviewed literature on library guide design is surprisingly small considering how many library guides have been created. The current research focus has been on usability and user experience studies, although some researchers have begun to focus on instructional design principles. As Bergstrom-Lynch (2019) states, peer-reviewed literature addressing library guide design through the lens of instructional design principles is in its infancy. Researchers have primarily focused on collecting data on usage and usability (Conrad & Stevens, 2019; Ouellette, 2011; Quintel, 2016). German (2017), an instructional design librarian, argues that when the library guide is created and maintained through a learner-centered point of view, librarians will see library guides as “e-learning tools” (p. 163). Lee et al. (2021) noted the value of integrating active learning activities into library guides. Stone et al. (2018) conducted a comparison study between two library guides, one as-is and the other re-designed with pedagogical insight. Stone et al. (2018) concluded that “a pedagogical guide design, organizing resources around the information literacy research process and explaining the ‘why’ and ‘how’ of the process, leads to better student learning than the pathfinder design” (p. 290). A library guide representative of a pathfinder design lists resources rather than explaining them. Lee and Lowe (2018) conducted a similar study and noted more user interaction with the pedagogically designed guide than with the guide not designed with pedagogical principles; hence Stone et al. (2018) and Lee and Lowe (2018) reached similar findings.

Authors like German (2017) and Lee et al. (2021) have touched upon instructional design topics. Similarly, Adebonojo (2010) described aligning the content of a subject library guide with library sources shared in course syllabi, but does not expand to discuss other instructional design principles. Bergstrom-Lynch (2019) takes a more comprehensive approach, advocating for the ADDIE instructional design model (an acronym for Analysis, Design, Development, Implementation, and Evaluation) when building library guides. The analysis phase encourages the designer to note problems with current instruction. The design phase plans how the designer will rectify the learning gap identified in the analysis phase. The development phase entails adding instructional materials, activities, and assessments. The implementation phase involves introducing the materials to learners. The evaluation phase enables the designer to collect feedback and improve content based on suggestions. ADDIE is cyclical and iterative (Bergstrom-Lynch, 2019). Allen (2017) introduces librarians to instructional design theories in the context of building an asynchronous online information literacy course but does not tie these theories to building library guides.

While Bergstrom-Lynch (2019) focused on best practices for library guide design based on ADDIE, German et al. (2017) used service design thinking constructs to build effective instruction guides. The five core principles of service design thinking are “user-centered, co-creative, sequencing, evidencing, and holistic” (German et al., 2017, p. 163). Focusing on the user encourages the designer to think like a student and ask: What do I need to know to successfully master this content? The co-creative stage invites other stakeholders to add their perspectives and/or expertise to the guide. The sequencing component invites the librarian to think through the role of the librarian and library services before, during, and after instruction; German et al. (2017) advocate for information from each stage to be communicated in the library guide. Evidencing involves the librarian reviewing the library guide to ensure that the content aligns with the learning objective (German et al., 2017). Both authors advocate for instructional design methods but fall short of suggesting an assessment rubric for designing and peer-reviewing guides.

Smith et al. (2023) developed a library guide rubric for their library guide redesign project at the Kelvin Smith Library at Case Western Reserve University. This rubric focused heavily on accessibility standards using the Web Accessibility Evaluation Tool or WAVE. Although Smith et al. (2023) discuss a rubric, the rubric was crafted as an evaluation tool for the author of the guide rather than for a peer review process. 

Although Bergstrom-Lynch (2019), German et al. (2017), and Smith et al. (2023) are pioneering best practices for library guides, they take different approaches. Bergstrom-Lynch (2019) presents best practices for cyclical re-evaluation of the guide based on instructional design principles, derived from usability studies. The Smith et al. (2023) rubric emphasizes accessibility standards for ADA compliance, which are essential for course designers but only one component of a more comprehensive rubric. German et al. (2017) emphasize user-centered design through the design thinking method. Moukhliss intends to add to the literature by suggesting a remix of a vetted tool that course developers already use as a professional development exercise with faculty. This OSCQR-SUNY tool encompasses the varying perspectives of Bergstrom-Lynch (2019), Smith et al. (2023), and German et al. (2017).

Strengths & Weaknesses of the Library Guide

As with any tool, library guides have strengths and weaknesses. On the positive side, there are indications that library guides can play a role in improving students’ grades, retention, and overall research skills (Brewer et al., 2017; May & Leighton, 2013; Wakeham et al., 2012). Additionally, library guides are easy to build and update (Baker, 2014; Conrad & Stevens, 2019). They can accommodate RSS feeds, videos, blogs, and chat (Baker, 2014), are accessible to the world, and cover a vast range of library research topics. According to Lauseng et al. (2021), library guides are discoverable through Google searches and can be integrated into online Learning Management Systems (LMS). These factors support the view that library guides hold educational value and should be considered for use as Open Educational Resources (Lauseng et al., 2021).

However, there is no perfect educational tool. Library guide weaknesses include underutilization, largely due to students not knowing what they are or how to find them (Bagshaw & Yorke-Barber, 2018; Conrad & Stevens, 2019; Ouellette, 2011). Additionally, library guides can be difficult for students to navigate, contain unnecessary content, and overuse library jargon (Sonsteby & DeJonghe, 2013). Conrad & Stevens (2019) described a usability study in which students were disoriented when using library guides and reported that they did not understand their purpose, function, or how to return to the library homepage. Lee et al. (2021) and Baker (2014) suggest that librarians tend to employ a “kitchen sink” (Baker, 2014, p. 110) approach when building library guides, thus overloading the guide with inapplicable content.

Critical Pedagogy and Library Guides

In his publication Pedagogy of the Oppressed, Paulo Freire introduced the theory of critical pedagogy and asserted that most educational models reinforce systems of societal injustice through the assumption that students are empty vessels who need to be filled with knowledge and skills curated by the intellectual elite (Kincheloe, 2012; Downey, 2016). Early in the 21st century, information professionals built upon the belief that “Critical pedagogy is, in essence, a project that positions education as a catalyst for social justice” (Tewell, 2015, p. 26) by developing “critical information literacy” to address what some saw as the Association of College and Research Libraries’ technically sound, but socially unaware “Information Literacy Competency Standards for Higher Education” (Cuevas-Cerveró et al., 2023). In subsequent years, numerous librarians and educators have written about the role of information literacy in dismantling systems of oppression, citing the need to promote “critical engagement with information sources” while recognizing that knowledge creation is a collaborative process in which everyone engages (Downey, 2016, p. 41).

The majority of scholarly output on library guides focuses on user-centered design rather than specifically advocating for critical pedagogical methods. Yet a few scholars, such as Lechtenberg & Gold (2022), emphasize how the lack of pedagogical training within LIS programs often results in information-centric library guides rather than learner-centric ones. Their presentation at LOEX 2022 reiterates the importance of user-centered design in all steps of guide creation, including deciding whether a library guide is needed.

Additionally, the literature demonstrates that library guides are useful tools in delivering critical information literacy instruction and interventions. For instance, Hare and Evanson used a library guide to list open-access sources as part of their Information Privilege Outreach programming for undergraduate students approaching graduation (Hare & Evanson, 2018). Likewise, Buck and Valentino required students in their “OER and Social Justice” course to create a library guide designed to educate faculty about the benefits of open educational resources, partly due to students’ familiarity with the design and functionality of similar research guides (Buck & Valentino, 2018). Because library guides have been used to communicate the principles of critical pedagogy, the evaluation of institutional library guides should consider how effectively critical pedagogy is incorporated into their design.

The Library Guide Assessment Standards (LGAS) Rubric 

For the remixed rubric, Moukhliss changed the term “course” in OSCQR’s original verbiage to “library guide” and dropped some original standards based on the differences between the expectations for an online course (i.e., rubrics, syllabus, etc.) and a library guide. Likewise, several standards were added in response to the pros and cons of library guides found in the literature. Additionally, Moukhliss wrote annotations to add clarity to each standard for the peer review process. For example, Standard 2 in the remixed LGAS rubric prompts the reviewer to check whether the author defines the term library guide, since research has indicated that students do not know what library guides are or how to find them (Bagshaw & Yorke-Barber, 2018; Conrad & Stevens, 2019; Ouellette, 2011). Standard 7 suggests that the librarian provide links to the profiles of other librarian liaisons who may serve the audience using the library guide. Standard 9 prompts the reviewer to check whether the library guide links to the university library’s homepage, addressing Conrad & Stevens’s (2019) finding that users conflate the library guide with the library homepage. These standards were added to ensure that users are provided with adequate information about the nature of library guides, who publishes them, and how to locate additional guides, addressing the confusion that Conrad & Stevens (2019) noted in their library guide usability study. These added standards may also help those who discover library guides through a Google search.

Moukhliss intends the additional standards to provide context about the library guide to novice users, thus addressing the issue of information privilege, or the assumption that everyone starts with the same background knowledge. Standard 22 was added to discourage adding unnecessary resources to the guide, which Baker (2014) and Conrad & Stevens (2019) cited as a common problem. Standard 27 encourages the use of Creative Commons attribution, as suggested by Lauseng et al. (2021), who found that their Evidence Based Medicine library guide was used not only by faculty, staff, and students at the University of Illinois Chicago but also by a wider audience; recognizing its strong visibility and significant external usage, they considered it a potential candidate for an Open Educational Resource (OER). As library guides are often found without the help of a librarian, Standard 28 suggests that reviewers check that library guide authors provide steps for accessing the research tools and databases suggested in the library guide outside of the context of the guide. Providing such information may help mitigate Conrad & Stevens’s (2019) findings regarding students’ feelings of disorientation while using a library guide and difficulty navigating to the library homepage from the guide.

Standard 30 was added so that students have a dedicated Get Help tab explaining the variety of ways the user can contact their library and/or librarians for additional assistance. Standard 31 was re-written so that the user can check their understanding in a way appropriate for the guide (Lee et al., 2021), such as a low-stakes quiz or poll. Finally, Standard 32 encourages the user to provide feedback regarding the guide’s usefulness, content, design, etc., with the understanding that learning objectives follow an iterative cycle and are not static. Student feedback can help the authoring librarian update and maintain the guide’s relevancy to users and gives students the opportunity to become co-creators of the knowledge they consume.

UNF’s LGAS Rubric for Quality-Checked Review library guide includes an additional tab for a Quality-Checked badge and a suggested maintenance checklist for monthly, twice-a-year, and yearly reviews (see the Maintenance Checklist/Test Your Knowledge tab). Moukhliss borrowed and remixed the checklist from the Vanderbilt University Libraries (current as of 8/21/2023). The Peer Review Training Materials tab includes a training workbook and training video on the LGAS rubric, the annotations, and the peer review process. Moukhliss provides a Creative Commons license on the LGAS’s Start Here page to encourage other institutions to reuse and/or remix the guide.

Methodology, Theoretical Model, and Framework

Moukhliss and McCowan used the qualitative case study methodology. Gephart (2004) stated, “Qualitative research is multimethod research that uses an interpretive, naturalistic approach to its subject matter. . . . Qualitative research emphasizes qualities of entities —the processes and meanings that occur naturally” (pp. 454-455). Moukhliss and McCowan selected the exploratory multi-case study so that they could assess multiple student user/learning perspectives when accessing, navigating, and digesting the two library guides. 

The theoretical model used for this study is the Plan-Do-Check-Act (PDCA) cycle, a quality improvement model that evolved with input from Walter Shewhart and W. Edwards Deming (Koehler & Pankowski, 1996). The cycle walks a team through four steps: Plan, Do, Check, and Act. The Plan phase allows time to think through problems such as the lack of design standards for library guides. During the “Do” phase, Moukhliss selected and remixed the quality review tool SUNY OSCQR. Additionally, she selected a “kitchen sink” (Baker, 2014, p. 110) library guide and redesigned it with the proposed rubric. Moukhliss aimed only to remove dead links and/or outdated information when restructuring the guide. The only items deemed outdated were the CRAAP test learning object and selected books from the curated e-book list. The CRAAP test was removed, and no substitution of similar materials was made. The list of selected books was updated in the revised guide to reflect current publications. As Moukhliss restructured the guide, she decided to use tabbed boxes to chunk and sequence information to satisfy Standards 11, 12, 13, and 15. You may view this restructuring by comparing the original Fake News guide and the revised Fake News guide. The “Do” phase also included recruiting participants to evaluate the two library guides: the original Fake News guide and the revised Fake News Guide 2, which follows the suggested standards and peer review process. Moukhliss and McCowan submitted the library guide study proposal to the Institutional Review Board in November 2023, and the study was marked Exempt. In December 2023, Moukhliss recruited participants by emailing faculty, distributing flyers in the library, posting flyers on display boards, and adding a digital flyer to each library floor’s lightboard display.
The librarians added the incentive of 10-dollar Starbucks gift cards to the first 15 students who signed consent forms and successfully completed the 30-minute recorded Zoom session (or until saturation was reached).

Moukhliss interviewed one test pilot (P1) and ten students (P2-P11) for this study; she noted saturation after her seventh interview but continued to ten participants to increase certainty. Although some may view this as a low sample population, the decision aligns with the peer-reviewed literature. Hennink & Kaiser (2019) discuss saturation in in-depth interviews and point to Guest et al.’s (2006) study, which, after reviewing data from 60 in-depth interviews, determined that saturation presented itself between Interviews 7-12, “at which point 88% of all themes were developed and 97% of important (high frequency) themes were identified” (Hennink & Kaiser, 2019, para. 5).

The questionnaire framework for this study is centered on Bloom’s Taxonomy, which provides action verbs that align with hierarchical levels of learning: remember, understand, apply, analyze, evaluate, and create. McCowan incorporated various levels of Bloom’s Taxonomy as she built the UX script used for this study. Moukhliss presented Fake News and Fake News 2 interchangeably as Guide A and Guide B throughout the interview sessions. After each recorded Zoom session, Moukhliss reviewed the session, recorded the task completion times on the script, recorded the data to the scorecard, and uploaded the data into the qualitative software NVivo. Both script and scorecard are available on the Library Guide Study page. Moukhliss created a codebook with participant information, assigned code names (Participant 1, Participant 2, Participant 3, etc.), removed all personal identifiers as she prepared to upload the study’s data, and stored the codebook in a password-protected file on her work computer to keep identifiable information secure.
For coding, the authors chose NVivo, a qualitative analysis tool that organizes data by type (correspondence, observation, and interviews), enables the researcher(s) to easily insert notes in each file, and supports developing codes to discover themes. Moukhliss coded the interviews based on the LGAS (i.e., Standard 1, 2, 3, etc.), and additional codes were added regarding navigation and content. Moukhliss & McCowan reviewed the codes for themes and preferences regarding library guide design.

The “Check” phase guided Moukhliss and McCowan in considering the implementation of the LGAS rubric and peer review process for library guides at UNF. During this phase, they reviewed participants’ qualitative responses to the Fake News library guide and the Fake News 2 library guide. Data from the “Check” phase will drive Moukhliss & McCowan to make recommendations in the “Act” phase (Koehler & Pankowski, 1996), which will be discussed in the Conclusion.


Moukhliss worked with one test pilot and interviewed ten students for this study. The ten students’ majors were Nursing, Computer Science, Communications, Public Administration, Electrical Engineering, Information Technology, Health Sciences, Philosophy, and Criminal Justice. Participants included two first-year students, two sophomores, three juniors, two seniors, and one graduate student. Eight participants used desktops, whereas two completed the study on their phones. Regarding familiarity with library guides, one participant had never used a library guide before, two stated they rarely used them, two stated they occasionally used them, and five did not know whether they had ever used one.

Findings & Discussion

Overall, students navigated the revised Fake News 2 guide faster than the original guide for every task except listing the 5 Types of Fake News, likely because the 5 Types of Fake News were listed on the original guide’s first page. The mean time for successful task completion was 39 seconds on the original guide versus 22.2 seconds on the revised guide. Moukhliss noted that failed tasks were often linked to poorly labeled sections in both guides.

Although the content of the two guides was identical except for the removal of outdated information and dead website links from the original guide and an updated list of e-books on the revised guide, students’ overall mean confidence level was 4.2 for the original guide’s information versus 4.4 for the revised guide’s. The mean recommendation likelihood was 6.4 for the original guide, whereas it rose to 7.9 for the revised guide.

Regarding personal preferences for a library guide assigned as course reading, one student indicated they would want to work from the old guide, and nine indicated they would want to work from the revised guide for the following reasons:

  • The organization and layout are more effective.
  • Information is presented more clearly.
  • There is a dedicated tab for UNF resources.
  • It is easier to navigate.
  • It is less jumbled.
  • It is easier to follow when working with peers.

Regarding perceptions of which guide a professor might choose to teach with, three students chose the original guide, whereas the other seven indicated the revised guide. One student stated that the old guide was more straightforward and that the instructor could explain it during direct instruction. Reasons for preferring the revised guide include:

  • More “interactive-ness” and quizzes.
  • The presence of summaries.
  • Better presentation of content.
  • Easier location of information.
  • The guide doesn’t feel like “a massive run-on sentence.”
  • Ease of “flipping through the topics.”
  • The presence of library consult and contact information.

Although it was not part of the interview questions, Moukhliss documented that eight participants were unaware that a library guide could be embedded into a Canvas course, while one participant was aware; the tenth participant’s experience with embedded library guides went undocumented. Regarding preferences for embedding a library guide in Canvas, one student voted for the old guide, whereas nine preferred the revised guide. Remarks favoring the revised guide included its inclusion of the Get Help information that struggling students need and its ease of navigation.

Although not every standard from the LGAS rubric came up during the student interviews, the standards that students viewed as positive and appreciated additions to a guide’s design include Standards 4, 7, 11, 12, 15, 21, 22, 28, 30, and 31. Moukhliss also observed that two students navigated the revised guide by the hyperlinked learning objectives rather than by side navigation, indicating that Standard 5 may hold value for those who maneuver a guide through its stated objectives. One limitation she noted to hyperlinking an objective to a specific library guide page arises when that page includes a tabbed box: the library guide author cannot link directly to a specific sub-tab, as the link defaults to the first tab of the box’s information. Thus, students maneuvering the guide expected to find the listed objective on the first tab of the tabbed box and did not innately click through the sub-tabs to discover it.

Through observation, Moukhliss noted that six students initially struggled to navigate the library guides using the side navigation functionality, but after time with the guide and/or instruction from Moukhliss on side navigation, they were successful. Once students were comfortable navigating a guide, they preferred the sub-tabbed boxes of the revised guide to the organization of the original guide. Students found neither library guide perfect, but Moukhliss and McCowan noted an overall theme: the organization, proper sequencing, and chunking of information were perceived as important. Three students commented on appreciating clarification for each part of the guide, which lends support to proposed Standard 28.

Additionally, two students appreciated the library guide author’s profile picture and contact information on each page, and three students remarked positively on the presence of a Get Help tab on the revised guide. One participant stated that professors want students to have a point of contact with their library liaisons and do not like “anonymous pages” (referring to the original guide’s lack of an author profile). Another participant wanted to see more consult widgets listed throughout the library guide. Regarding the Fake News 2 guide, two students preferred that its first page present more content and less getting-started information. Furthermore, images and design mattered: one student remarked that they did not like the Fake News 2 banner, and several others disliked the lack of imagery on the guide’s first page. For both guides, students consistently remarked on liking the Fake News infographics.

Among those supporting the original guide or parts of it, three students liked the CRAAP Test worksheet and wanted to see it in the revised guide, not knowing that members of the instruction team had deemed the worksheet dated and that Moukhliss had removed it for that reason. One student wanted to see the CRAAP Test worksheet repurposed as a flowchart about fake news. Moukhliss noted that most students perceived the objects listed on the original and revised guides to be current, relevant, and vetted. Eight participants did not question their usefulness or relevancy, or whether the library guide author was maintaining the guide. Only one student pointed out that the old guide had an outdated list of e-books and that the list was refreshed for the new guide. Thus, Moukhliss’s observations may reinforce to library guide authors that library guides should be reviewed and refreshed regularly, as proposed by Standard 22, since most students in this study appeared to take at face value that what is presented on a guide is not only vetted but continuously maintained.

Initial data from this study indicate that using the LGAS rubric with annotations and a peer review process may improve the learning experience for students, especially regarding what information to include in a library guide and how to sequence and chunk it. Early data also indicate that students appreciate a Get Help section and want to see Contact Us and library liaison/author information throughout the guide’s pages.

Because six students initially struggled to maneuver through a guide, Moukhliss and McCowan suggest including navigation instructions in the library guide banner, in a brief introductory video on the Start Here page, or in both locations. Here is a screenshot of sample banner instructions:

A sample Fake News library guide banner being used to show students how to maneuver the guide. The banner states: "Navigate this guide by selecting the tabs." And "Some pages of this guide include sub-tabs to click into."

As stated, Moukhliss noted that most students were unaware of the presence of library guides in their Canvas courses. This may indicate that librarians should provide direct instruction during one-shots on not only what library guides are and how to maneuver them, but also directly model how to access an embedded guide within Canvas.


Conclusion
Library guides have considerable pedagogical potential. However, there are no widely used rubrics for evaluating whether a particular library guide has design features that support its intended learning outcomes. Based on this study, librarians who adopt or adapt the LGAS rubric will be more likely to design library guides that support students’ ability to complete relevant tasks. At UNF, Moukhliss and McCowan plan to suggest that library administration employ the LGAS rubric and annotations with a peer review process and consider templatizing the institution’s library guides to honor the proposed standards the student participants deemed most impactful. This includes recommending a Get Started tab for guide template(s), with placeholders for introductory text, a library guide navigation video tutorial, visual navigational instructions embedded in the guide’s banner, and the guide’s learning objectives. Furthermore, they propose an institutionally vetted Get Help tab that can be mapped to each guide. Other proposals include templatizing each page to include a page synopsis; applicable explanations for accessing library-specific resources and tools from the library’s homepage; placeholders for general contact information, a link to the library liaison directory, and the author’s bio picture; feedback and assessment; and a research consultation link or widget, as well as instructions for accessing the library’s homepage.

Following the creation of a standardized template, Moukhliss plans to propose recruiting a team of volunteer peer reviewers (library staff, librarians, and library administration) and providing training on the LGAS rubric, the annotations, and the peer review process. She will recommend that all library guide authors train on the proposed LGAS rubric and the new library guide template for future library guide authorship projects and for updating and improving existing guides based on the standards. The training will cover the rubric, the annotations, and the maintenance calendar checklists for monthly, bi-annual, and yearly review. All proposed training materials are available on the LGAS’s Start Here page.

Moukhliss and McCowan encourage other college and university librarians to consider using or remixing the proposed LGAS rubric for a quality-checked review and to conduct studies on students’ perceptions of the rubric to add data to this research. The authors suggest that future studies survey both students and faculty on their perspectives on using a quality assurance rubric and peer review process to increase the pedagogical value of a library guide. They also encourage authors of future studies to report on their successes and struggles in forming and training library colleagues on using a quality-checked rubric for library guide design and the peer review process.


Acknowledgements
We would like to express our gratitude to Kelly Lindberg and Ryan Randall, our peer reviewers. We would also like to thank the staff at In The Library with the Lead Pipe, including our publishing editor, Jaena Rae Cabrera.


References
Adebonojo, L. G. (2010). LibGuides: customizing subject guides for individual courses. College & Undergraduate Libraries, 17(4), 398–412. https://doi.org/10.1080/10691316.2010.525426  

Allen, M. (2017). Designing online asynchronous information literacy instruction using the ADDIE model. In T. Maddison & M. Kumaran (Eds.), Distributed learning pedagogy and technology in online information literacy instruction (pp.69-90). Chandos Publishing.

Bagshaw, A. & Yorke-Barber, P. (2018). Guiding librarians: Rethinking library guides as a staff development tool. Journal of the Australian Library and Information Association, 67(1), 31–41. https://doi.org/10.1080/24750158.2017.1410629

Baker, R. L. (2014). Designing LibGuides as instructional tools for critical thinking and effective online learning. Journal of Library and Information Services in Distance Learning, 8(3–4), 107–117. https://doi.org/10.1080/1533290X.2014.944423 

Bergstrom-Lynch. (2019). LibGuides by design: Using instructional design principles and user-centered studies to develop best practices. Public Services Quarterly, 15(3), 205–223. https://doi.org/10.1080/15228959.2019.1632245

Berić-Stojšić, & Dubicki, E. (2016). Guiding students’ learning with LibGuides as an interactive teaching tool in health promotion. Pedagogy in Health Promotion, 2(2), 144–148. https://doi.org/10.1177/2373379915625324

Bielat, V., Befus, R., & Arnold, J. (2013). Integrating LibGuides into the teaching-learning process. In A. Dobbs, R. L. Sittler, & D. Cook (Eds.). Using LibGuides to enhance library services: A LITA guide (pp. 121-142). ALA TechSource.

Brewer, L., Rick, H., & Grondin, K. A. (2017). Improving digital libraries and support with online research guides. Online Learning Journal, 21(3), 135-150. http://dx.doi.org/10.24059/olj.v21i3.1237

Buck, S., & Valentino, M. L. (2018). OER and social justice: A colloquium at Oregon State University. Journal of Librarianship and Scholarly Communication, 6(2). https://doi.org/10.7710/2162-3309.2231

CIRT. (n. d.) Online Course Design Quality Review. https://www.unf.edu/cirt/id-Quality-Review.html

Clever, K. A. (2020). Connecting with faculty and students through course-related LibGuides. Pennsylvania Libraries, 8(1), 49–57. https://doi.org/10.5195/palrap.2020.215

Conrad, S. & Stevens, C. (2019). “Am I on the library website?”: A LibGuides usability study. Information Technology and Libraries, 38(3), 49–81. https://doi.org/10.6017/ital.v38i3.10977

Coombs, B. (2015). LibGuides 2. Journal of the Medical Library Association, 103(1), 64–65. https://doi.org/10.3163/1536-5050.103.1.020

Cuevas-Cerveró, A., Colmenero-Ruiz, M.-J., & Martínez-Ávila, D. (2023). Critical information literacy as a form of information activism. The Journal of Academic Librarianship, 49(6), 102786. https://doi.org/10.1016/j.acalib.2023.102786

Dotson, D. S. (2021). LibGuides gone viral: A giant LibGuides project during remote working. Science & Technology Libraries, 40(3), 243–259. https://doi.org/10.1080/0194262X.2021.1884169

Downey, A. (2016). Critical information literacy: Foundations, inspiration, and ideas. Library Juice Press.

Emanuel, J. (2013). A short history of LibraryGuides and their usefulness to librarians and patrons. In A. Dobbs, R. L. Sittler, & D. Cook (Eds.). Using LibGuides to enhance library services: A LITA guide (pp. 3-20). ALA TechSource.

Gephart, R. P., Jr. (2004). Qualitative research and the Academy of Management Journal. Academy of Management Journal, 47(4), 452–462. https://doi.org/10.5465/amj.2004.14438580

German, E. (2017). Information literacy and instruction: LibGuides for instruction: A service design point of view from an academic library. Reference & User Services Quarterly, 56(3), 162-167. https://doi.org/10.5860/rusq.56n3.162

German, E., Grassian, E., & LeMire, S. (2017). LibGuides for instruction: A service design point of view from an academic library. Reference and User Services Quarterly, 56(3), 162–167. https://doi.org/10.5860/rusq.56n3.162

Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18, 59–82. https://doi.org/10.1177/1525822X05279903

Halupa, C. (2019). Differentiation of roles: Instructional designers and faculty in the creation of online courses. International Journal of Higher Education, 8(1), 55–68. https://doi.org/10.5430/ijhe.v8n1p55

Hare, S., & Evanson, C. (2018). Information privilege outreach for undergraduate students. College & Research Libraries, 79(6), 726–736. https://doi.org/10.5860/crl.79.6.726

Hennink, M., & Kaiser, B., (2019). Saturation in qualitative research, In P. Atkinson, S. Delamont, A. Cernat, J.W. Sakshaug, & R.A. Williams (Eds.), SAGE Research Methods Foundations. https://doi.org/10.4135/9781526421036822322

Hulen, K. (2022). Quality assurance drives continuous improvements to online programs. In S. Kumar & P. Arnold (Eds.), Quality in online programs: Approaches and practices in higher education. (pp. 3-22). The Netherlands: Brill. https://doi.org/10.1163/9789004510852_001 

Kathuria, H., & Becker, D. W. (2021). Leveraging course quality checklist to improve online courses. Journal of Teaching and Learning with Technology, 10(1) https://doi.org/10.14434/jotlt.v10i1.31253 

Kincheloe, J. (2012). Critical pedagogy in the twenty-first century: Evolution for survival. Counterpoints, 422, 147–183.

Koehler, J. W. & Pankowski, J. M. (1996). Continual improvement in government tools & methods. St. Lucie Press.

Lauseng, D. L., Howard, C., Scoulas, J. M., & Berry, A. (2021). Assessing online library guide use and Open Educational Resource (OER) potential: An evidence-based decision-making approach. Journal of Web Librarianship, 15(3), 128–153. https://doi.org/10.1080/19322909.2021.1935396

Lechtenberg, U. & Gold, H. (2022). When all you have is a hammer, everything looks like a LibGuide: Strengths, limitations, and opportunities of the teaching tool [Conference presentation]. LOEX 2022 Conference, Ypsilanti, MI, United States.  https://vimeo.com/721358576 

Lee, Hayden, K. A., Ganshorn, H., & Pethrick, H. (2021). A content analysis of systematic review online library guides. Evidence Based Library and Information Practice, 16(1), 60–77. https://doi.org/10.18438/eblip29819

Lee, Y. Y., & Lowe, M. S. (2018). Building positive learning experiences through pedagogical research guide design. Journal of Web Librarianship, 12(4), 205-231. https://doi.org/10.1080/19322909.2018.1499453

Mann, B. J., Arnold, J. L., and Rawson, J. (2013). Using LibGuides to promote information literacy in a distance education environment. In A. Dobbs, R. L. Sittler, & D. Cook (Eds.). Using LibGuides to enhance library services: A LITA guide (pp. 221-238). ALA TechSource. 

May, D. & Leighton, H. V. (2013). Using a library-based course page to improve research skills in an undergraduate international business law course. Journal of Legal Studies Education, 30(2), 295–319. https://doi.org/10.1111/jlse.12003

OSCQR – SUNY Online Course Quality Review Rubric (n. d.). About OSCQR. https://oscqr.suny.edu/

Ouellette, D. (2011). Subject guides in academic libraries: A user-centred study of uses and perceptions. Canadian Journal of Information and Library Science, 35(4), 436–451. https://doi.org/10.1353/ils.2011.0024

Pickens, & Witte, G. (2015). Circle the wagons & bust out the big guns! Tame the “Wild West” of distance librarianship using Quality Matters™ benchmarks. Journal of Library & Information Services in Distance Learning, 9(1-2), 119–132. https://doi.org/10.1080/1533290X.2014.946352

Quintel, D. F. (2016, January/February). LibGuides and usability: What our users want. Computers in Libraries Magazine, 36(1), 4-8. 

Smith, E. S., Koziura, A., Meinke, E., & Meszaros, E. (2023). Designing and implementing an instructional triptych for a digital future. The Journal of Academic Librarianship, 49(2), 102672. https://doi.org/10.1016/j.acalib.2023.102672

Sonsteby, A. & DeJonghe, J. (2013). Usability testing, user-centered design, and LibGuides subject guides: A case study. Journal of Web Librarianship, 7(1), 83–94. https://doi.org/10.1080/19322909.2013.747366

Springshare (n. d.). LibGuides. https://springshare.com/libguides/

Stone, S. M., Lowe, M. S., & Maxson, B. K. (2018). Does course guide design impact student learning? College & Undergraduate Libraries, 25(3), 280–296. https://doi.org/10.1080/10691316.2018.1482808

Tewell, E. (2015). A decade of critical information literacy: A review of the literature. Comminfolit, 9(1), 24-43. https://doi.org/10.15760/comminfolit.2015.9.1.174

Unal, Z. & Unal, A. (2016). Students matter: Quality measurements in online courses. International Journal on E-Learning, 15(4), 463–481. https://www.learntechlib.org/primary/p/147317/

Wakeham, M., Roberts, A., Shelley, J. & Wells, P. (2012). Library subject guides: A case study of evidence-informed library development. Journal of Librarianship and Information Science, 44(3), 199-207. https://doi.org/10.1177/0961000611434757 
