October 27–28, 2025
The Canadian Association of Research Libraries and the University of Regina’s Dr. John Archer Library and Archives invite you to participate in the Canadian Library Assessment Workshop (CLAW) taking place October 27 and 28, 2025 in Regina, Saskatchewan.
The workshop is designed for staff from academic and research libraries engaged in assessment at any level of experience. The program will consist of a series of case studies and practical workshops providing attendees with methods, tools, and techniques to bring into their everyday assessment practice.
Presentations will be delivered in English; slides and supporting materials will be translated into French and made available in advance to support bilingualism.
A block of rooms at The Atlas Hotel has been arranged at a rate of $149.95 per night. Reservations can be made by calling 1-306-586-3443 or by email, quoting code 102625CLA.
University of Regina campus map
Archer Library is denoted by building code LY.
Food and drink in Regina
Note that dine-arounds will be organized in advance of the workshop.
Register before September 26 by completing this form; payment details are below.
Fees: $430.00 CAD [Note that this fee is waived for speakers, who are only required to complete the registration form linked above.]
Registration fees include breakfast, snacks, and lunch. An evening reception is also included and generously provided by the University of Regina’s Dr. John Archer Library and Archives on October 27. Please let us know of any dietary restrictions when you register.
Participants preferring to pay by e-transfer can direct their funds to . Participants preferring to pay via credit card or PayPal can do so using the link below.
Day 1: Monday, October 27, 2025

| Time | Session | Speaker(s) |
|---|---|---|
| 8:00 AM – 8:45 AM | Breakfast and Registration | — |
| 8:45 AM – 9:00 AM | Welcome | Dr. David Gregory, Provost and Vice-President (Academic), University of Regina & CLAW Program Committee |
| 9:00 AM – 9:20 AM | Impact Compass for Research Libraries: The University of Calgary Experience | Justine Wheeler & Mary-Jo Romaniuk (University of Calgary) |
| 9:25 AM – 9:45 AM | Communicating the Value of a Library Consortium to Non-Library Audiences | Sabina Pagotto (Scholars Portal) & Katrina Fortner (OCUL) |
| 9:50 AM – 10:10 AM | Strategic Planning with Impact: A Logic Model Case Study from the University of Ottawa Library | Katrine Mallan (University of Ottawa) |
| 10:15 AM – 10:35 AM | From Scattered to Strategic: Developing Real-Time Dashboards for Library Assessment | Sam Vettraino, Suzy Lee, & Sarah Mantz (Western University) |
| 10:35 AM – 11:00 AM | Break | — |
| 11:00 AM – 11:20 AM | Going Beyond the Gate Counts: Library Census Day | Sarah Coysh & Sheril Hook (York University) |
| 11:25 AM – 11:45 AM | Assessing Project Success in Academic Libraries | Jessica Lange & Nailisa Tanner (McGill University) |
| 11:45 AM – 1:00 PM | Lunch | — |
| 1:00 PM – 1:20 PM | It Ain’t Broke... Or Is It? Facing Ambiguity and Uncertainty in Assessment | Juliene McLaughlin & Robin Bergart (University of Guelph) |
| 1:25 PM – 1:45 PM | Enhancing User Experience: Applying a Student-Focused Approach to Library Website Design | Ali Foster, Abigail Graham, & David Pearson (MacEwan University) |
| 1:45 PM – 2:10 PM | Break | — |
| 2:10 PM – 4:10 PM | Dialogue in Library Assessment: Facilitation, Approaches, and Techniques | Julie Jones, Ali Moore, Jennifer Zerkee, & Adair Harper (Simon Fraser University) |
| 4:30 PM | Reception | — |
Day 2: Tuesday, October 28, 2025

| Time | Session | Speaker(s) |
|---|---|---|
| 8:00 AM – 8:50 AM | Breakfast | — |
| 8:50 AM – 9:00 AM | Welcome / Announcements | — |
| 9:00 AM – 10:30 AM | Ten Recent Assessment Projects that We Love | Justine Wheeler (University of Calgary) & Christine Brown (University of Alberta) |
| 10:30 AM – 11:00 AM | Break | — |
| 11:00 AM – 11:30 AM | Lightning Talks: Beyond Bugs; Testing for Usability and Accessibility; Asking Chat about Library Search | Caitlin Bakker; Caitlin Bakker & Christopher Read (University of Regina); Susan Bond (University of Toronto) |
| 11:30 AM – 1:00 PM | Lunch | — |
| 1:00 PM – 1:20 PM | Evaluating the Pedagogical Effectiveness of In-class Simulations in a Faculty Partnership | Michelle Goodridge (Wilfrid Laurier University) |
| 1:25 PM – 1:45 PM | Are Three Shots Better Than One? Assessing the Impact and Design of a New Information Literacy Program | Amy McLay Paterson (Thompson Rivers University) |
| 1:50 PM – 2:10 PM | Measuring What Matters: Assessing the Impact of Information Literacy Training Using Quantitative Methods and Analytics | Rong Luo, Shuzhen Zhao, and Karen Pillon (University of Windsor) |
| 2:10 PM – 2:30 PM | Ask the Users What They Want: Assessing How Researchers Find and Use Digital Collections in Their Work | Sarah Severson (University of Alberta) & Nailisa Tanner (McGill University) |
| 2:30 PM – 2:45 PM | Break | — |
| 2:45 PM – 3:05 PM | Peer-led IL Teaching Assessment: Cultivating a Culture of Reflective Assessment Practice Through Peer-Mentorship | Jody Nelson & Alison Pitcher (MacEwan University) |
| 3:10 PM – 3:40 PM | From Chaos to Clarity: Preparing Open Access Publication Data for Collection Assessment | Erin Calhoun (University of Toronto) |
| 3:45 PM – 4:05 PM | “Surprise Me!”: What a User Community Survey Revealed About an Open Stacks Collection | Ze’ev Schneider (Library of Parliament) |
Impact Compass for Research Libraries: The University of Calgary Experience – Justine Wheeler & Mary-Jo Romaniuk (University of Calgary)
The presenters will describe the Libraries & Cultural Resources (LCR) Impact Study at the University of Calgary. This study focuses on the experience of two distinct user groups: faculty and students. It was conducted in close collaboration with Implement Consulting Group.
The purpose of this study is to better understand LCR’s role and impact in the academic lives of students and faculty through capturing and measuring what happens when these users engage with LCR. Additionally, the study aims to inspire and inform future strategic decisions and plans for Libraries and Cultural Resources.
The study is based on an Impact Compass framework, which is a mixed methods approach designed to explore how cultural experiences impact individuals. The Compass serves as a tool for quantitatively communicating how an individual experiences a particular cultural product or service, while also using qualitative data analysis to gain deeper insight into the findings.
While the Impact Compass framework had previously been used in public libraries, it had never been used in a research library context. To ensure the Impact Compass resonated in LCR’s context, staff, students, and faculty were consulted, and significant adaptations to the Compass were made.
Specifically, through this collaborative process four pillars of library experience were determined: space, collections and digital resources, teaching and learning support, and research support. Each pillar was assigned an impact compass, and each compass was divided into four dimensions of impact: emotional, intellectual, social, and creative. Each dimension consisted of three parameters that were tested and refined to reflect the needs and context of a research library. Qualitative data was then used to add nuance and understanding to the compass. From this, impact profiles were developed.
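To make the structure concrete, a minimal sketch of how one pillar’s compass could be represented and scored is shown below; the pillar and dimension names follow the abstract, but the parameter names, 1–5 scale, and mean-based aggregation are illustrative assumptions rather than the presenters’ instrument.

```python
# Hypothetical sketch of one Impact Compass pillar (the "space" pillar).
# Dimension names follow the abstract; parameter names and the 1-5 scale
# are invented for illustration and are not the study's actual instrument.
from statistics import mean

space_compass = {
    "emotional":    {"belonging": 4.2, "comfort": 3.8, "inspiration": 4.0},
    "intellectual": {"focus": 4.5, "learning": 4.1, "curiosity": 3.9},
    "social":       {"connection": 3.2, "community": 3.6, "exchange": 3.4},
    "creative":     {"ideation": 3.7, "expression": 3.1, "experimentation": 3.5},
}

# An impact profile here is simply the mean score per dimension, which
# could then be plotted on the four axes of the compass.
profile = {dim: round(mean(scores.values()), 2)
           for dim, scores in space_compass.items()}
print(profile)  # {'emotional': 4.0, 'intellectual': 4.17, 'social': 3.4, 'creative': 3.43}
```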
The presenters will discuss the results from the study, including the impact profiles for both students and faculty. Finally, the presenters will discuss how the findings of the study have been used and plans for future use.
Learning Objectives:
Communicating the Value of a Library Consortium to Non-Library Audiences – Sabina Pagotto (Scholars Portal) & Katrina Fortner (Ontario Council of University Libraries)
In late 2024, the library directors who make up the Ontario Council of University Libraries (OCUL) asked for support in explaining the value of the services they receive from OCUL and its service arm, Scholars Portal, to senior university administrators outside the library. With more than 20 member institutions varying significantly in size, research intensity, and institutional priorities, there was no consensus about what kinds of information would be helpful for communicating with senior administrators. Some institutions, for example, preferred hard numbers and specifically asked about return on investment or cost savings; others felt that statements about qualitative benefits would make more compelling arguments with their administrators.
In response to this request, the OCUL-Scholars Portal communications group launched into action. We investigated various ways of calculating value and cost savings in different services or areas of operations, until we identified a range of data points we felt confident calculating for every institution. We also wrote value statements articulating the benefits or outcomes of consortial services that we expected to resonate with non-library administrators. Once we had a strong set of statements, we asked library directors to rank the statements they felt would be most helpful for their institution. With the help of a template designed by our web services team, we then generated custom reports for every institution that responded. These reports were distributed along with supporting documentation that provided more context to allow library directors to confidently discuss these reports with administrators at their institution, connecting the dots between library cost or resource savings and positive campus community impacts.
This session will discuss the challenges, opportunities, and lessons learned throughout the project.
Participants will learn how to:
Strategic Planning with Impact: A Logic Model Case Study from the University of Ottawa Library – Katrine Mallan (University of Ottawa)
This session will share how the University of Ottawa Library is using the CARL Library Impact Framework to bring greater clarity, coherence, and evaluability to the implementation of our new Strategic Plan.
Using one strategic direction as a case study, I will walk participants through how our Library Management Group collaboratively developed a logic model to map key activities, outputs, audiences, outcomes, and underlying assumptions. This process helped us move beyond broad aspirations to define tangible actions, test the plausibility of the impact pathway, and lay the groundwork for meaningful evaluation.
By grounding our planning in the CARL Framework, we built a shared understanding of how our work creates impact and established a clearer line of sight between strategy and implementation.
The session will highlight how the use of logic models:
This case study is particularly relevant for libraries seeking to bridge the gap between strategic planning and operational execution, and for those aiming to integrate evaluative thinking without overwhelming staff or resources.
Learning Objectives:
By the end of the session, participants will be able to:
This session is designed to be practical and transferable, especially for libraries navigating the messy middle between strategy design and execution. Our case study is intended to spark ideas for libraries seeking to embed evaluation into their planning practices and make strategic intent more actionable and visible.
From Scattered to Strategic: Developing Real-Time Dashboards for Library Assessment – Sam Vettraino, Suzy Lee, and Sarah Mantz (Western University)
Academic libraries collect a wide array of data through various platforms, tools, and procedures. However, staff and leadership often find it difficult to access or interpret that data. At Western Libraries, we faced a common challenge: our data was hard to find, housed in different platforms, inconsistently tracked across teams, and difficult to connect for evidence-based decision making at all levels.
To address this, we are developing a real-time dashboard to promote transparency and make our data accessible. The dashboard aims to support evidence-based decision making across the entire organization, informing strategic planning, resource allocation, and operations.
The project began with consultations with leadership to determine the scope of the project and define key goals. This was followed by a series of internal data inventory meetings with key stakeholders from across our User Services and Senior Leadership teams to understand what data was being collected, how, and by whom. These meetings revealed gaps in data collection, along with key differences in how teams view and use their data. In parallel, we worked with members of our LITS (Library Information Technology Services) team to assess the technical requirements, explore visualization options, and identify the most sustainable options over the long term.
In this session, we will share our planning process, lessons learned, and practical steps we have used to map and consolidate our data across a large library system. We will also discuss how we framed our process using ADKAR (Awareness, Desire, Knowledge, Ability, Reinforcement) to support sustained change and contribute towards a culture of assessment at Western Libraries.
Learning Objectives / Key Takeaways
Going Beyond the Gate Counts: Library Census Day – Sarah Coysh & Sheril Hook (York University)
Who spends time in our libraries, and how much time are they spending? That is the question that drove us to conduct a Census Day at all of our library locations, capturing 19,232 entrances and exits across the day. We selected seventeen data points to record, including present postal code, degree level, commuter or in-residence status, mother tongue, and age group, among others. We particularly wanted to map the distance that students travelled and see whether it correlated with the length of time they spent in the libraries. We were also curious whether there was a correlation between an individual student’s area of study and the library they chose. We noted that 96% of our patrons were undergraduate students, with an average stay of 130 minutes.
Like Dillalogue & Koehn (2020), we wanted to delve deeper than gate counts alone, so we created a tap-in/tap-out tool for users to voluntarily tap their campus ID when entering and leaving all six of our libraries on a single day in 2025. We modeled this assessment project on McGill Libraries, which has run a similar event in the past. Our goal was to capture 80% of the traffic in a single day during our opening hours. We also used Census Day as a catalyst for our LibQUAL+ survey, for which we received more completed surveys than in previous years.
Two approaches were critical to the success of Census Day: involving the entire library staff and partnering with the Office of Institutional Planning & Analysis (OIPA). Staff were involved in establishing the ID card scanning system, planning the logistics of the day for each library, assigning roles on Census Day, developing and disseminating communication assets, celebrating the success of the day, and assisting in the preliminary analysis of the results. Partnering with OIPA allowed us to select from over 200 data points, maintain the anonymity of the data, and develop language for describing the analysis within the constraints of protecting patron data.
In this session we will share details of what we learned about who is using the libraries and how long they’re staying, including data that surprised us and follow-up assessment ideas.
Attendees will learn about the variety of data points collected by academic institutions and how those can be used to inform decisions about hours, staffing, space design, traffic flow, and more.
Attendees will learn about an open-source dashboard system (Metabase) that allows the data to be queried and visualized.
Attendees will also receive a link to a packet with guidance on running their own Census Day.
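As a rough illustration of the kind of dwell-time calculation a tap-in/tap-out tool enables (the records and field layout below are invented, not the presenters’ dataset or their Metabase queries), a few lines of Python can pair entry and exit taps and compute the average stay:

```python
# Hypothetical tap records: (anonymized card ID, event, minutes since midnight).
# Real Census Day data came from the ID card scanning system and was explored
# through a Metabase dashboard; this sketch is for illustration only.
taps = [
    ("u1", "in", 540), ("u2", "in", 555), ("u1", "out", 660),
    ("u2", "out", 700), ("u3", "in", 600), ("u3", "out", 745),
]

open_visits = {}
stays = []
for card, event, minutes in taps:
    if event == "in":
        open_visits[card] = minutes
    elif card in open_visits:              # skip an "out" with no matching "in"
        stays.append(minutes - open_visits.pop(card))

print(f"{len(stays)} visits, average stay {sum(stays) / len(stays):.0f} minutes")
```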
Reference: Dillalogue, E., & Koehn, M. (2020). Assessment on a Dime: Low Cost User Data Collection for Assessment. Evidence Based Library and Information Practice, 15(3), 2–20. https://doi.org/10.18438/eblip29582
Assessing Project Success in Academic Libraries – Jessica Lange & Nailisa Tanner (McGill University)
At the start of any project, stakeholders define what constitutes success. This definition is then used at the end of the project to evaluate whether those aims were met and ultimately to determine whether the project was “successful.” Typical project management metrics such as key performance indicators (KPIs) and resource utilization, though potentially useful, speak to businesses and for-profit workplaces and may not be the most relevant or transferable to academic environments, where profit is not a factor.
After completing multiple projects in McGill Libraries’ technology unit and using typical metrics to evaluate them, it became clear that though these metrics answered the “what” of a project, they did not answer the “how”. A project could be completed on time and meet its objectives, but may have encountered significant issues along the way, such as poor planning, staff conflict, and rushed timelines. Defining success in such narrow terms left staff feeling dissatisfied, particularly if a project met significant challenges. A project retrospective is one model for capturing qualitative feedback, allowing team members to learn and grow from (sometimes) negative experiences and providing additional context beyond typical success measures.
This presentation’s learning objectives include:
It Ain’t Broke... Or Is It? Facing Ambiguity and Uncertainty in Assessment – Juliene McLaughlin & Robin Bergart (University of Guelph)
As the saying goes, “if it ain’t broke, don’t fix it.” But sometimes it’s not so clear whether a library service is fine as it is and should be left alone, whether it is starting to break, or whether it is already broken. For many years our library maintained a public webpage listing databases, and in 2016 we implemented Springshare’s A-Z database list. Over this time, our UX team has conducted usability studies, gathered web analytics, and monitored the library literature. We’ve recommended incremental changes to ensure this page is as usable and useful as possible. More recently, however, we began to wonder whether this page is still relevant, or whether it’s doing more harm than good. After a thorough UX investigation, we were unable to reach any definitive conclusion, but our intuition that there is a problem has not gone away. What do you do when your gut tells you there’s a problem, but the evidence is inconclusive? When should you abandon incremental improvements and make a radical change? Who decides?
Learning objectives:
Enhancing User Experience: Applying a Student-Focused Approach to Library Website Design – Ali Foster, Abigail Graham, and David Pearson (MacEwan University)
This brief presentation highlights the experiences and lessons learned from a comprehensive redesign of an undergraduate library website that focused on enhancing the user experience for undergraduate students. The project team consisted of four library staff members: a graphic designer, an IT specialist, and two librarians. The main challenge the team faced was developing a user experience (UX) protocol that prioritized the perspectives of students while also garnering support from library staff for the proposed changes to our existing assessment practices.
Prior to the project’s launch in Fall 2023, previous website redesign efforts had primarily considered the opinions of library staff and faculty. Consequently, the website’s information architecture and tone of voice were flagged as potential areas of improvement during the redesign process. This concern was validated when the team collaborated with upper-year anthropology students conducting ethnographic studies of the library website. In both discussions with the students and in their formal reports, it became clear that students found the website inaccessible, confusing, and unwelcoming. To that end, the project team developed a UX protocol that prioritized student perspectives as well as best practices in accessible and inclusive web design.
This presentation will provide a brief overview of the project, key examples of student-focused user experience tools we applied, and our observations and lessons learned as we continue to assess and refine the library website.
Learning Outcomes – Participants will:
Dialogue in Library Assessment: Facilitation, Approaches, and Techniques – Julie Jones, Ali Moore, Jennifer Zerkee, and Adair Harper (Simon Fraser University)
Library assessment practitioners can integrate the tools and techniques of dialogue to facilitate meaningful and inclusive community-based assessments—with library users as well as library colleagues. This workshop will introduce theory, knowledge, and skills for facilitating dialogue in library assessment work.
Dialogue can be defined in many ways. It is a method for articulating needs, desires, and underlying assumptions in groups that rejects hierarchical power dynamics and is transformation focused. It uses participatory practices to host generative conversations, be more inclusive, foster mutual learning, centre lived expertise, and harness collective wisdom. It “can support groups and communities to respectfully explore polarizing issues, address conflict, deliberate on potential for complex issues and much more” (Morris J. Wosk Centre for Dialogue, Simon Fraser University, n.d.). In his 1999 book, Dialogue and the Art of Thinking Together, William Isaacs succinctly and rather beautifully defined dialogue as “a conversation with a center, not sides” (p. 19). Isaacs went on to state, “It is a way of taking the energy of our differences and channeling it toward something that has never been created before. It lifts us out of polarization and into a greater common sense, and is thereby a means for accessing the intelligence and coordinated power of groups of people” (p. 19).
Attendees of this workshop will learn about how the Instruction Plan Sub-Group of SFU Library’s Working Group for Information Literacy & Instruction (WGILI) is using methods in dialogue to assess the needs of their library instructor colleagues and to integrate their colleagues’ lived expertise into the Library’s new Instruction Plan, which the Sub-Group is currently creating. After an introduction to theory and the SFU context, attendees will be invited to assume the role of participants in a structured dialogue created by the Sub-Group to assess the needs of library instructors at SFU Library. This dialogue will facilitate co-constructed knowledge, distributed leadership, the cross-pollination of ideas, and shared and visible sense-making.
After experiencing the structured dialogue as participants, attendees will be invited to assume the role of facilitators as we “pull back the curtain” and review and reflect on the components, design, and outcomes of the dialogue. We will share the theory behind decisions made and actions taken and discuss the transferability of the facilitation methods used, with an emphasis on the practical application of theoretical concepts and principles in other library assessment contexts. Templates and facilitation resources that attendees can bring into their assessment practice will be shared. We’ll also discuss the benefits of bringing methods in dialogue and facilitation into library assessment work.
Learning objectives
At the end of this workshop, attendees will be able to:
References
Isaacs, W. (1999). Dialogue and the art of thinking together: A pioneering approach to communicating in business and in life (1st ed). Currency.
Morris J. Wosk Centre for Dialogue, Simon Fraser University. (n.d.). What is Dialogue? Retrieved June 1, 2025, from https://www.sfu.ca/dialogue/learn/what-is-dialogue.html
Ten Recent Assessment Projects that We Love – Justine Wheeler (University of Calgary) & Christine Brown (University of Alberta)
The assessment landscape is changing. There is an ongoing shift to a focus on value and impact, an increase in qualitative and mixed-method assessment, and nascent explorations into the role of generative artificial intelligence in assessment. In our current environment, it can be difficult to keep up with new initiatives and developments – especially with limited time, resources, and/or technology.
In this fun and fast-paced session, the presenters will share their recent assessment projects and practice, as well as highlight innovative practices at other libraries. While many of the projects discussed will be from academic libraries, all of them can be applied in academic, public, government, and other library settings. Examples will range from projects and practices that can be done with limited resources (time, money, technology) to more resource-intensive ones.
Projects and practices presented will vary in focus (collections, space, instruction, etc.), methodology (point-of-use, predictive collections, photovoice, etc.), time period (cross-sectional, longitudinal), and scope of impact (individual, institutional, societal).
Learning outcomes:
This session is aimed at librarians who want to discover fresh perspectives or methodologies in the field of library user experience, assessment, or impact practice. The goal of the session is for attendees to leave with a new approach to assessment that they can apply in their library.
Time will also be provided for audience members to share their own recent, exciting assessment initiative.
Beyond Bugs: Using a ‘Report a Problem’ Feature to Understand User Expectations – Caitlin Bakker (University of Regina)
In February 2023, the University of Regina implemented a “Report a Problem” feature in its discovery system. Since then, more than 600 reports have been submitted, and the feature has become a critical feedback channel. Through these reports, we’ve gained insight into aspects of the search and discovery experience that patrons find confusing, counterintuitive, or frustrating. These user-submitted problem reports frequently highlight mismatches between system functionality and user needs and assumptions. By analyzing these reports, we’ve been able to better understand user behaviour and expectations of the library system, and of the library generally. This lightning talk will outline how the reports were analyzed and the insights gained through the process.
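As a hedged sketch of one way such reports can be tallied once they have been qualitatively coded (the codes and reports below are invented; the talk describes the actual analysis), consider:

```python
# Hypothetical tally of coded problem reports by theme. Codes are invented
# for illustration; they are not the University of Regina's actual categories.
from collections import Counter

coded_reports = [
    ["broken-link"], ["confusing-facets", "user-expectations"],
    ["broken-link", "e-resource-access"], ["user-expectations"],
    ["e-resource-access"], ["confusing-facets"],
]

theme_counts = Counter(code for report in coded_reports for code in report)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```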
Testing for usability and accessibility: Using a think-aloud protocol as an assessment tool for Open Journal Systems – Caitlin Bakker & Christopher Read (University of Regina)
In this lightning talk, the presenters will share their experience with usability and accessibility testing as part of the University of Regina’s launch of Proceedings of the Canadian Nuclear Society in Open Journal Systems (OJS). This lightning talk will focus on best practices and takeaways from using a think-aloud protocol to assess the collection and its search functionality. The primary goal of this testing was to enable diverse user communities to engage in effective information seeking, whether they are accessing the system via desktop computers or mobile devices. The session will cover practical aspects of the testing methodology, such as working with a remote Library practicum. Attendees can expect to learn about specific usability testing techniques including prompts, screen readers, recording, and other aspects of think aloud testing, and how they can provide critical insight into improving system design and user satisfaction.
Asking Chat about Library Search – Susan Bond (University of Toronto)
At the University of Toronto, we have in recent years added our embedded chat help tool to our discovery layer, Ex Libris Primo VE, which we have branded locally as LibrarySearch. Because LibrarySearch is a self-service tool, every chat launched represents a potential pain point: it’s a way of users letting us know they’re not able to self-serve. In early 2025 we undertook an exploratory assessment of one month’s worth of chat transcripts, seeing what we could discover about challenges in our interface, gaps in our documentation, and problems with our e-resources. Our findings included evidence in favour of changes we were already working on, some quick wins, and some things we’ll need to think more about before we can address them. In this lightning talk we’ll look at our specific findings about our user interface, our support documentation, and the staffing of our chat support, and touch on what we learned about our coding/assessment approach and how we will change it going forward.
Evaluating the Pedagogical Effectiveness of In-class Simulations in a Faculty Partnership – Michelle Goodridge (Wilfrid Laurier University)
In 2017, a librarian-faculty partnership led to the creation of an in-class simulation for a first-year undergraduate course on global human rights. Students took on roles representing global actors (e.g., nation states and NGOs) to respond to a simulated crisis in their own role’s best interest, which led to developing research, evaluation, and critical thinking skills in an active learning environment. A mixed-methods research study assessed the impact of the simulation, combining qualitative data from reflections and observations with rubric-based quantitative analysis. This session will outline the simulation design, assessment strategies, and key findings, demonstrating how play-based learning can meaningfully enhance student engagement and skill development. Attendees will gain insights into applying these methods in other instructional settings and using data to advocate for deeper faculty-librarian collaboration.
Are Three Shots Better Than One? Assessing the Impact and Design of a New Information Literacy Program – Amy McLay Paterson (Thompson Rivers University)
In 2023-2024, Thompson Rivers University librarians developed a pilot to transform an information literacy program from predominantly one-shots into a structured, integrated program. The English 1100 Library Instruction Pilot (ELIP) project was developed as a collaboration between the Library and the English department in Spring 2023, with the goal of expanding library instruction to support student success and belonging. Rather than participating in the usual one-shot session, designated ELIP students took part in a series of three weekly library tutorials. The ELIP project ran throughout the 2023-24 academic year, during which time we collected student assignment data, student feedback, faculty feedback, and reflective journal entries from librarian participants to determine the effectiveness and sustainability of the program.
The shortcomings of one-shot information literacy sessions are widespread and well documented in our profession. Nicole Pagowsky states that “the one-shot—even if there is more than one—makes it difficult to reach deeper learning, critical thinking, and inclusive pedagogy” (2021, 2). However, our approach to the pilot—neither a one-shot nor a credit course—is unique within the established literature.
By each of these measures, the ELIP program improved outcomes over the usual one-shot. Faculty survey results were very positive, and the librarian instructors all observed the depth of student engagement and positive relationship-building. This closer relationship with students also resulted in tangible improvements to our teaching practices and existing learning objects. My presentation will introduce the ELIP program, including the rationale for its design; introduce the various assessment methods we used; and discuss our results and next steps. Course materials, as well as assessment tools (such as surveys and consent forms), will be made available to interested attendees.
Measuring What Matters: Assessing the Impact of Information Literacy Training Using Quantitative Methods and Analytics – Rong Luo, Shuzhen Zhao, and Karen Pillon (University of Windsor)
As information literacy (IL) instruction continues to evolve in academic libraries, assessment practices must keep pace to ensure meaningful outcomes. This session explores the effectiveness of IL training by presenting a study conducted at the University of Windsor that assesses both perceived and demonstrated IL skills among graduate students in the Faculty of Arts, Humanities, and Social Sciences (FAHSS). Grounded in the ACRL’s Framework for Information Literacy for Higher Education and using a modified version of the Beile Test of Information Literacy for Education (B-TILED), the study highlights how analytics and structured assessment tools can inform, improve, and validate library instruction programs.
IL training takes various forms, from classroom-based sessions to orientation activities and one-on-one consultations, but its effectiveness often goes unexamined. By comparing students who received formal IL instruction with those who did not, this session reveals whether training correlates with stronger research skills and confidence in navigating academic resources.
Participants will learn about a flexible, scalable assessment approach using B-TILED, which combines demographic data, self-assessment of research abilities, and objective knowledge tests. The session also discusses practical considerations for survey design, implementation, and analysis using SPSS or similar platforms.
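As a hedged sketch of the kind of group comparison this enables (the study used SPSS or a similar platform; the scores below are invented and the test choice is an assumption), the analysis might look like this in Python:

```python
# Hypothetical comparison of B-TILED-style test scores for students who
# received formal IL instruction versus those who did not. All numbers are
# invented; the actual study's analysis was conducted in SPSS or similar.
from scipy import stats

instructed     = [18, 21, 17, 22, 19, 20, 23, 18, 21, 20]
not_instructed = [15, 17, 14, 19, 16, 15, 18, 17, 14, 16]

# Welch's t-test: is demonstrated IL skill higher in the instructed group?
result = stats.ttest_ind(instructed, not_instructed, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```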
Learning Objectives:
By the end of this session, participants will be able to:
Ask the Users What They Want: Assessing How Researchers Find and Use Digital Collections in Their Work – Sarah Severson (University of Alberta) & Nailisa Tanner (McGill University)
For over 20 years, libraries have been digitizing and providing online access to their collections to better serve the research, teaching, and creative aims of academics and non-academics alike. The open, online nature of most library-digitized collections has helped facilitate access for new audiences; however, this openness also means that libraries know comparatively little about those using these resources, making these resources difficult for libraries to assess. If you don’t know who your users are, how can you learn about them?
User experience research has gained popularity and widespread acceptance in libraries as a standard set of techniques for assessing library services and interfaces, and digital collections platforms are no exception here. However, much of the user experience research conducted for the purposes of assessing digital collections programs and platforms is limited to institutional affiliates and is concerned only with the digital collections platform itself, neglecting questions about how users find these collections in the first place as well as how they use them in their research.
This presentation reports on a mixed-methods study conducted with the aim of broadening our horizons about who digital collections users are and how they use library-digitized resources in their research. Our study used anonymous surveys and qualitative interviews to achieve a balance of breadth of participation and depth of answers. Recognizing the open nature of our digital collections, we included academic researchers regardless of location or institutional affiliation, as well as non-academic researchers like journalists, artists, bloggers, and public historians. Our study asked three main research questions: Who is using library-digitized collections? How do they find them? And finally, how do they use them in their research?
In this presentation we will focus on the methods we used for this study, their outcomes, and what we learned from the process. We will discuss the importance of using multiple types of research techniques for assessment in the context of user experience research, and in particular what this type of mixed-methods research can contribute to the suite of user experience research tools. Participants will learn about the strengths of mixed-methods research; how exploratory research techniques can complement and enhance other evaluative data that may be gathered for library assessment, such as usability studies and web analytics; and how to apply these techniques to answer big questions in user experience and library service assessment.
Peer-led IL Teaching Assessment: Cultivating a Culture of Reflective Assessment Practice Through Peer-Mentorship – Jody Nelson & Alison Pitcher (MacEwan University)
Our session will feature the peer-led information literacy [IL] teaching assessment program we facilitate for librarians at our undergraduate university, including the impetus, development, theoretical foundation, structure, and impact. We will share observations about how the program is contributing to our understanding of reflective teaching practice as central to teaching assessment and the development of a more cohesive community of teaching practice among instructional librarians involved in the program.
Our Reflective Peer-Mentorship Teaching Triangles program fulfills the Library’s mandate for developing sustainable IL teaching assessment practices in alignment with our University’s Strategic Plan, Teaching Greatness. Taking a student-centred approach to assessment and grounded in the critical assessment of library instruction framework developed by McCartin & Dineen (2018), our critical IL assessment program promotes the intentional, continuous evaluation of teaching cultivated through reflective practice. By placing the work of assessment and improvement on us as teachers, we shift the assessment burden (i.e., filling out feedback forms or measurement instruments) off of students.
Building on our library’s long-standing involvement with teaching observation triangles and IL teaching mentorship, our program offers ongoing teaching assessment opportunities combined with targeted peer mentorship. Program participants form peer-mentorship triangles that meet multiple times across an academic term. Individual participants identify the priority teaching elements (e.g., content, accessibility, interactions with students) on which they would like to receive constructive feedback; peer observers adopt a strengths-based approach, sharing positive feedback alongside useful suggestions for improvement in the pre-identified areas. As members of a co-mentorship triad, participants aim to foster teaching excellence in one another while also learning from one another, collecting ideas for enhancing student learning experiences as they observe one another teach. For early career librarians in particular, the co-mentorship aspect creates a safe environment for developing teaching competence and confidence. This program not only allows for meaningful assessment of teaching but also affords participants the opportunity to foster individual reflective practice in a way that is continual, sustained, and relational (Lewitzky, 2020).
Following our session, participants will leave with an understanding of:
References:
Lewitzky, R. A. (2020). Educating, learning, and growing: A review of the teaching role in academic librarianship. College & Undergraduate Libraries, 27(1), 32-40.
McCartin, L. F., & Dineen, R. (2018). Toward a Critical-Inclusive Assessment Practice for Library Instruction. Litwin Books.
From Chaos to Clarity: Preparing Open Access Publication Data for Collection Assessment – Erin Calhoun (University of Toronto)
Developments in open access publishing, business models, and funding policies have brought scholarly communications into the fold of collection development work. Increasingly, library collection development funds are repurposed to cover publishing fees through transitional read and publish agreements. With this shift comes a pressing need for new assessment frameworks – ones that go beyond traditional e-resource usage statistics to measure publication output and financial investment. Libraries must develop capacities to gather and analyze institutional publication data to inform collection development decisions.
However, assessing open access publishing is not straightforward. Open access publication data is often presented in inconsistent formats and scattered across bibliometric and publisher reports, which hinders aggregated analysis. To address this challenge, the University of Toronto Libraries launched a pilot project aimed at streamlining the collection, cleaning, and visualization of OA publication data.
This short session will share insights from the pilot, which included gathering publication data reports from several open access agreements, standardizing the data into a unified template, and building an interactive dashboard to visualize the university’s publishing activity and financial contributions to open access publishing. The project initially explored programming-based solutions for data cleaning but ultimately adopted KNIME, a low-code, open-source platform, to manage complex data cleaning tasks in a scalable and accessible way.
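To illustrate the kind of standardization step involved (the pilot used KNIME workflows rather than hand-written code, and the report layout, column names, and exchange rate below are invented), a short pandas sketch could map one publisher’s report into a unified template:

```python
# Hypothetical illustration only: the pilot used KNIME, and real publisher
# reports vary far more than this. All column names and values are invented.
import pandas as pd

# One publisher's APC report, with its own column naming and currency.
publisher_a = pd.DataFrame({
    "Article DOI": ["10.1000/x1", "10.1000/x2"],
    "Corresponding Author Dept": ["Physics", "History"],
    "APC (USD)": [2450.0, 1980.0],
})

# The unified template every report is mapped into before aggregation.
COLUMN_MAP = {
    "Article DOI": "doi",
    "Corresponding Author Dept": "department",
    "APC (USD)": "apc_cad",
}
USD_TO_CAD = 1.35  # assumed fixed rate, for illustration

unified = publisher_a.rename(columns=COLUMN_MAP)
unified["apc_cad"] = unified["apc_cad"] * USD_TO_CAD
unified["agreement"] = "Publisher A read-and-publish"
print(unified)
```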
From this presentation, participants will:
This presentation will be valuable for those interested in new assessment methods and data sources, data analytics tools, and those looking to begin working with open access publication data.
“Surprise Me!”: What a User Community Survey Revealed About an Open Stacks Collection – Ze’ev Schneider (Library of Parliament)
The Library of Parliament’s (Library) rich collections, developed over many decades, are central to its mission of contributing to Canadian parliamentary democracy by “managing and delivering authoritative, reliable and relevant information and knowledge.” The collections are distributed among several branches in the parliamentary precinct as well as an offsite storage facility.
In addition, a self-serve branch intended primarily for Library researchers provides on-site access to a print collection. A recent comprehensive assessment of this collection freed up ample shelf space for new acquisitions. In considering how to develop the collection and manage the space, we reflected on the current role and value of on-site print for subject specialists in their work and research, professional development, and leisure reading. This process led to the user survey covered in this presentation.
In this session, I will present the survey from conception to results.
The presentation will include:
Learning objectives
Conference participants will learn about:
