Note that all sessions will be held in the Weldon Community Room (128), which is on the main floor of the D.B. Weldon Library (1151 Richmond St, London, ON). Meals will be served in the Weldon Staff Lounge on the 5th floor, accessible via the elevators (once on the 5th floor, turn left to access the staff area). Breaks will be held in Room 121 (across and to the right of the Community Room).
Monday, October 23
Mark Robertson (Toronto Metropolitan University), Justine Wheeler (University of Calgary), Tania Gottschalk (Thompson Rivers University) This workshop provides an overview of the CARL Library Impact Framework and hands-on opportunities to apply logic models for the purposes of assessment. The CARL Library Impact Framework has taken the idea of impact pathways and used logic models as a way to visualize the arc of influence of our libraries’ programs, resources, and services. Logic models are predominantly used for program planning and evaluation in the social services, government, and not-for-profit sectors; they have been used less frequently in the academic library context. The logic model framework has the benefit of providing a structured approach, delineating input from output from impact, and aiding us in describing the different forms impact can take. Discussion will focus on potential uses of logic models, as well as possible limitations and challenges. Participants will take away an understanding of the CARL Library Impact Framework, logic models, and how they may be applied in assessment work. Links to online resource material will also be provided. No prior knowledge is necessary to participate. Learning Activity: The facilitators will work with the participants to create an assessment question and then work through a logic model. The facilitators will also lead a discussion of the Framework’s uses and challenges.
Jenny Hirst (University of Waterloo) The University of Waterloo Library, like many others, has been through significant upheaval in terms of data, analysis, and assessment over the past few years. It started with an ILS migration at the end of 2019, combined with the demand for all the data, all the time, in all the ways possible (reporting, data visuals, storytelling) throughout the pandemic, all while we continued to adapt and evolve our services for our patrons. While wonderful work came out of this period, there was a downside: there was no time for maintenance, no time for documentation, no time for further assessment, and so forth. It was simply not a sustainable approach for our library going forward. Now that the dust has started to settle, we’re making the time to create an analytics roadmap. Changes in the landscape continue to call for a better understanding of our reporting and analytics needs, practices, and resourcing. We need to know where we need to be going in the next few years, sooner rather than later, and so we’ve set out on this journey to define our analytics and reporting vision, the gaps, and how to get there. Our roadmap consists of three phases: current state (what we’re doing), future state (what we should be doing), and the roadmap itself (how we are going to get there). We have recently completed the first phase of the roadmap, which included a detailed data audit across all library departments, with a focus on collection methodology, storage, access, retention policies, privacy, and purpose. To gather this information, we chose a series of interviews and discussions, as opposed to a survey or data request, which proved very fruitful. A high-level table of this information, with contact information for key library staff to increase the transparency of our assessment work as the year goes on, will be hosted on our analytics wiki and kept up to date.
From this data audit, we were also able to flag potential data privacy concerns, provide lower-level recommendations mainly related to data entry/storage best practices, and determine key themes that will help focus future phases for the roadmap. In this presentation, we would like to take the opportunity to share our findings and to present our methodology in detail, in the hopes that this work might help other libraries that are trying to find their feet again after a whirlwind of rapid assessment changes.
Shelley Arvin (Cunningham Memorial Library, Indiana State University) In response to post-pandemic changes and the current economy, while a university foundational studies program proceeded to restructure itself and its program, an Assessment Plan for Skills Applied Learning Requirements (SALRs) was initiated. The plan, which included information literacy, critical thinking, and developmental writing as SALRs, started with the Information Literacy SALR as a pilot case. Based on lessons learned from the pilot, the foundational studies assessment team intended to refine the plan and make final recommendations for a complete SALR assessment. A multi-pronged approach was used for information literacy assessment. First, the librarian representative on the foundational studies council developed a LibWizard quiz that assessed the information literacy knowledge and skills of freshman students in composition and communication near the start of their college careers. Second, student artifacts from foundational studies composition and communication courses were collaboratively assessed by professors and librarians using rubrics which included information-literacy-related outcomes. Third, librarians were asked to propose competency levels for information literacy at the freshman, sophomore, junior, and senior levels using assessable statements. Results of the first and second assessments were analyzed, critically discussed, and shared with constituencies and stakeholders. The third initiative was used as a foundation for future conversations about assessment-friendly instruction and collaborations with course instructors. Attendees will learn:
Giovanna Badia (McGill University Library) Counts are commonly collected to determine library usage, such as the number of items borrowed, questions answered, in-person visits, etc. Counts also have characteristics that violate the assumptions of linear regression models and other statistical techniques used to investigate relationships between variables. For example, counts cannot have values below zero, whereas linear regression models can predict negative values. Poisson and negative binomial, including their zero-truncated and zero-inflated variants, are regression models that account for the properties of count data. This workshop will provide an overview of these count regression models, including what they are, how they differ from linear regression, and how they can be used to analyze library usage data. Criteria for comparing statistical models to identify the one best suited to the question being investigated will also be presented. R and Stata code for running count regression models will be shared with workshop attendees. Learning activities – Participants will be asked to: Learning objectives – Participants will be able to: This workshop assumes that participants have used linear regression or have a general understanding of what it does.
Andrea McLellan, Jo-Anne Petropoulos, Stephanie Sanger, and Wei Zhang (McMaster University) In the Fall of 2022, McMaster University signed an agreement to participate in PLoS’ Community Action Publishing (CAP) program. In signing this agreement, the libraries aimed to increase McMaster’s research publishing output, particularly for researchers and graduate students in areas of unfunded research, by bolstering support for Article Processing Charges (APCs). The PLoS deal was attractive because the journals are Open Access and have the potential to increase institutional research impact. Our Open Access Assessment Interest Group selected PLoS as a case example for developing a framework for assessing the impact of Open Access Publishing agreements on the acquisition budget, APC costs, institutional publishing output, and research impact. The group conducted a pre-agreement assessment of the PLoS deal to determine potential benefits to the institution. We combined institutional trend data with publisher APC data, and with our knowledge of the research landscape at McMaster University, to draft an assessment framework that will guide our upcoming Open Access Publishing assessments. Participants in this session will gain an understanding of how to use InCites and other tools to generate reports on institutional publishing trends. They will also have an opportunity to reflect on how institutional strategic goals can guide assessment frameworks. Participants will leave this session with tangible guidance on how to select and prioritize data to monitor for the long-term assessment of Open Access Publishing agreements.
Sabina Pagotto (Scholars Portal / Ontario Council of University Libraries) We know that convenience is central to information-seeking behaviour. This is particularly true in academic libraries, where undergraduate students are unlikely to go out of their way to access a particular electronic resource when a different “good enough” resource is easier to access (see, for example, Lynn Silipigni Connaway, Timothy J. Dickey, and Marie L. Radford, “If it is too inconvenient I’m not going after it: Convenience as a critical factor in information-seeking behaviors,” Library & Information Science Research 33, no. 3 (2011): 179-190). In practice, this means that changes to the way a user interacts with an e-resource platform, or even a discovery tool, can impact COUNTER and other usage statistics. It’s not that the content itself is any more (or less) useful; it’s that accessing it has become more (or less) convenient. In this quick session, I will present the results of two case studies where changing interfaces resulted in changes in usage patterns, and thus usage statistics: a change to the Scholars Portal Books platform that made DRM-protected books easier to read and therefore increased their usage, and a change in subscribing libraries’ link resolving methods that impacted their usage of the Scholars Portal Journals platform. Participants will learn to: Above all, I hope participants will come away with the understanding that e-resource usage statistics aren’t only about collection assessment—they’re also about user experience.
Ze’ev Schneider (University of Ottawa) This presentation will cover various aspects of a collection assessment project undertaken by the University of Ottawa Library’s Collection Strategy team as part of a collection lifecycle management initiative. The project, which started in winter 2023 (analysis now complete; implementation ongoing), was an assessment of the value of local print journal holdings that also have full-text coverage in JSTOR collections owned by the library. Approximately 600 titles in the local collection were identified in the overlap analysis. While the majority of titles were selected for deaccessioning following the assessment, approximately 40 titles were identified for retention in print, based both on the relevance of the subject matter to the local collection and on the results of comparing print content to digitized JSTOR versions in specific cases. The project also includes a collaboration component with the Center for Research Libraries (CRL), in an effort to supplement the CRL JSTOR print archive (available to all member libraries) through the donation of volumes missing from that archive and selected for deaccessioning from the local collection. This session will cover the team-based approach to the assessment and an overview of the decision-making process; review various data points considered, including image density and sample comparisons of the print to the digitized journal versions; review some of the technical aspects involved; and note some of the practical challenges in implementing this cross-functional project work. Practical learning objectives:
Cam Laforest (University of Alberta) The University of Alberta Library is a large research library and a member of CRKN. In 2020, CRKN subscribed to Unsub, a collection analysis tool designed to help librarians assess the value of ‘big deal’ journal packages. This presentation will describe how the Unsub tool combines list price, usage, citation, and publication data to produce a model that allows users to project likely savings, as well as the loss of access, resulting from the cancellation of a ‘big deal’ journal package in favour of individual subscriptions. It will discuss how Collection Strategies Librarians at the University of Alberta used Unsub in 2021 and 2022 to assess several ‘big deal’ journal subscription packages. Several scenarios using data from the University of Alberta will be shared and discussed in detail, as a means of demonstrating the utility of the Unsub model as an assessment tool. The presentation will conclude with observations about ‘big deal’ journal package subscriptions at the University of Alberta that are facilitated by the Unsub tool, and a discussion of the advantages and limitations of using Unsub for journal package assessment more generally. Participants will learn:
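As a toy illustration of the kind of trade-off such a model weighs, the sketch below compares a hypothetical package price against the projected cost of individual subscriptions plus interlibrary loan. This is not Unsub's actual algorithm (which also incorporates citation, authorship, and open-access data); every figure, title, and rate here is invented.

```python
# All figures below are invented for illustration only.
big_deal_cost = 500_000  # hypothetical annual package price

journals = [
    # (title, individual list price, annual downloads)
    ("Journal A", 12_000, 4_500),
    ("Journal B", 9_000, 120),
    ("Journal C", 15_000, 25_000),
    ("Journal D", 7_500, 80),
]

ILL_COST_PER_REQUEST = 15  # assumed cost to fill one interlibrary loan
USE_TO_ILL_RATE = 0.05     # assumed fraction of lost uses that become ILL requests

# Naive rule: subscribe individually only when that beats the projected ILL cost.
kept, alternative_cost = [], 0.0
for title, price, downloads in journals:
    projected_ill_cost = downloads * USE_TO_ILL_RATE * ILL_COST_PER_REQUEST
    if price < projected_ill_cost:
        kept.append(title)
        alternative_cost += price
    else:
        alternative_cost += projected_ill_cost

print(f"Titles retained individually: {kept}")
print(f"Projected cost without the big deal: ${alternative_cost:,.0f}")
print(f"Projected savings: ${big_deal_cost - alternative_cost:,.0f}")
```

Even this crude version shows why per-title usage data matters: a handful of heavily used titles justify individual subscriptions, while low-use titles are cheaper to serve via interlibrary loan.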
Jack Young (McMaster University) This session will introduce a novel approach to bibliometric analysis to uncover the distinctive features of highly cited research for use in the assessment of library collections. Utilizing popular citation-based tools (Web of Science & InCites), the presentation will explore how factors such as publication source, collaboration data, and research topics can be analyzed to assess the impact of current library collections and make evidence-informed decisions around collection development. This type of analysis meets the needs of stakeholders across the institution. In addition to supporting collection development activities within the library, this data has supported the work of Marketing & Public Relations, University Advancement, and individual research groups at the presenter’s home institution. By engaging in this type of assessment, libraries position themselves at the centre of the growing field of data-driven research strategy, opening new opportunities for collaboration and increasing their value to the institution. In addition to providing foundational knowledge on the key concepts underlying research impact assessment, this session will focus on building actionable reports for a variety of assessment purposes. Attendees will learn to:
Shawn Mitchell (Toronto Public Library) Join us for an inspiring and insightful keynote session as we delve into the realm of reimagining public libraries’ value and impact assessment. In an era marked by rapid technological advancements and evolving community needs, traditional methods of measuring library success no longer suffice. This session will spotlight the ground-breaking strides taken by the Toronto Public Library, showcasing how public libraries can harness new approaches and innovative tools to effectively determine, measure, and communicate their value.
Klara Maidenberg & Naz Torabi (University of Toronto Libraries) In the summer of 2020, the University of Toronto Libraries launched the process of creating an assessment framework to capture and track the organization’s efforts to advance the goals expressed in its 2020-2025 Strategic Plan. The goal of this project was to create a comprehensive and broad framework that would capture all the areas of library activity covered by the strategic plan. The project also aimed to embed assessment as a formal component of the annual work-planning process that departments and committees carry out, thereby further enhancing the libraries’ culture of assessment. After reviewing existing best practices and literature in this area, we proceeded to develop our own approach to creating this framework, aiming to create a structure that would outlive the lifespan of the current strategic plan and remain a viable part of organizational practices going forward. This presentation will summarize our journey towards establishing and implementing the framework over the past three years. Participants should expect to learn the following:
Greg Davis (Iowa State University) This session will provide an overview of the Research Library Impact Framework (RLIF) initiative and the 18 projects that were included. The session will include a presentation by the Western Libraries team who will discuss their RLIF project, including their findings and next steps, along with time for Q&A. This will be followed by a learning activity designed to disseminate information from the RLIF project intended to help participants identify potential areas for research back at their library that would build on the ARL work. The session objectives are:
Shahira Khair and Samantha Macfarlane (University of Victoria) During the COVID-19 pandemic and the resulting closure of the University of Victoria campus, our student body keenly felt the loss of our physical library spaces. While we received positive feedback from the community about how well the Libraries adapted to providing services to our users working remotely (e.g., curbside pick-up of print materials, virtual workshops and reference, etc.), we also heard that users, and students in particular, missed the places themselves, not just the services offered within them. Inspired by these stories of personal connection between students and the library as a significant place, we sought to lead a research study to capture in their own words how students perceive and relate to the University of Victoria Libraries’ physical spaces. This presentation summarizes the process and findings of our qualitative research study on the relationship between students and the physical spaces of the University of Victoria Libraries. User narratives about their perceptions of and relationships to the physical library were captured from 42 user interviews. Ambient sounds of library life around our physical environment were also recorded. Thematic analysis of the resulting dataset is ongoing, and we expect it will inform a more personal understanding of the range of users the Libraries serve and the needs that the physical space fills, which go beyond our more standard measures. Academic libraries are key physical spaces supporting learning and research, but also relaxation, leisure, and socialization. Throughout their university experience, the students we spoke with had formed very personal and unique relationships to the Libraries and described a sense of ownership over particular spaces they enjoy spending time in. The Libraries’ role in community building is very strong, with many interviewees describing friendships and support networks fostered and nurtured by the spaces.
With this data, we intend to build online and in-person auditory exhibits to illustrate the value and impact of the Libraries’ physical spaces and to inspire our community with their potential for connection, inclusion, and creativity. This knowledge will also be leveraged in future initiatives to improve services and provide more welcoming and inclusive physical spaces for users. As libraries grapple with space and service planning in a hybrid-learning environment, the insights that we will share in this presentation can help inform a more holistic understanding of the impact of library spaces on student experience in a post-COVID world. Learning Outcomes:
Kathy Ball & Leeanne Romane (McMaster University) In our roles, where we gather and report instruction statistics, we encountered discrepancies and inconsistencies in understanding and tracking instruction statistics for the CARL Statistical Survey. These internal inconsistencies led us to investigate our own practices and to explore the practices of other Canadian academic libraries. Through two short surveys of library staff responsible for instruction statistics, as identified on their websites, we investigated the measures and methods Canadian academic libraries use to track library instruction activity, which traditionally has been based on two metrics: number of sessions and number of participants. We also conducted interviews with interested participants to further explore this topic, including barriers and challenges to keeping statistics and the use of statistics as impact or success factors. We were also interested in how, or to what extent, these statistics are used in the library for planning and assessment purposes. This presentation will provide a summary of our findings and the recommendations that we are making to improve our process here at McMaster Library. Learning outcomes: Participants will learn about the challenges of tracking instruction statistics and possible approaches and methods to improve the process.
Robin Bergart, Juliene McLaughlin, and Ron Ward (University of Guelph) Come with us on a journey as we try to assess the past twenty years of assessment strategy and operation at our library. During this period, there have been multiple grassroots assessment committees, senior management-led mandates, frameworks, roadmaps, plans, and priorities, as well as individual efforts. We’ve tried out logic models, introduced UX into the mix, and configured three versions of a library-wide assessment team. How do all these experiments and iterations tell a story about the place of assessment at our library? Do they reflect the trends in the evolution of assessment in Canadian libraries more widely? We will explore whether there is a “best place” for assessment to sit within our organization, who should be setting the direction and scope, who is responsible for the work, and ultimately, does assessment matter and for whom? We hope that by reflecting on this journey and sharing these reflections, we will prompt you likewise to reflect on where assessment sits in your own library. We want to hear other stories of the journey of finding a home for assessment. Is there a best place to situate assessment, or is it contingent upon the particular organizational environment?
Fred Chan (Mount Royal University Library) Library practitioners deal with data daily. In many cases, we report descriptive statistics such as totals, percentages, and means to summarize our data. Inferential statistics, which are a set of methods that allow us to make estimates and draw conclusions about a population based on a sample of data, are less commonly applied. Inferential statistics techniques could enable us to draw more insights from our data to improve decision-making, enhance workflows, and allocate resources. This workshop offers library practitioners a practical and simple (i.e., no complex formula) way to learn about the applications of inferential statistics. It covers three forms of analyses: proportions, means, and associations, which offer many possibilities to gain insights from library data, such as: The workshop will use jamovi, an open-source software based on R, which offers a user-friendly graphical user interface, a variety of statistical analyses, data visualization capabilities, and powerful plugins. By incorporating inferential statistics into their analysis toolkit, practitioners can uncover valuable insights into user behaviors, preferences, and perceptions. Through hands-on experience with a real, library-related dataset, participants can further apply their learnings and make data-driven decisions in their work. By the end of this workshop, participants will be able to: Course materials, including refreshers on basic descriptive statistics, will be made available to participants before the workshop. To maximize learning outcomes, participants are encouraged to review the materials before the workshop.
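To make the three forms of analysis concrete (the workshop itself uses jamovi; this is only a parallel sketch in Python using scipy, with invented survey data), each test below corresponds to one category: a proportion test, a comparison of means, and a test of association.

```python
from scipy import stats

# All data below are invented for illustration.

# 1) Proportion: 130 of 200 sampled patrons report using the library weekly.
#    Is this consistent with a hypothesized population proportion of 0.5?
prop_result = stats.binomtest(130, n=200, p=0.5)
print(f"Proportion test p-value: {prop_result.pvalue:.4f}")

# 2) Means: do two user groups differ in average weekly visits?
group_a_visits = [3, 5, 2, 4, 6, 3, 5, 4, 2, 5]
group_b_visits = [2, 1, 3, 2, 2, 4, 1, 2, 3, 2]
t_stat, t_p = stats.ttest_ind(group_a_visits, group_b_visits)
print(f"t = {t_stat:.2f}, p = {t_p:.4f}")

# 3) Association: is satisfaction related to user group?
#    Rows: satisfied / not satisfied; columns: group A / group B.
contingency = [[40, 25],
               [10, 25]]
chi2, chi_p, dof, expected = stats.chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {chi_p:.4f}")
```

In jamovi these same analyses appear as point-and-click menus rather than code; the underlying inferences are identical.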
Katherine McColgan (Canadian Association of Research Libraries) The annual collection of Canadian library statistics is a critical process for understanding the state of libraries and their impact on society. To ensure the continued relevance of these statistics, it is important to regularly review and update the survey. This presentation will outline the changes made to the annual collection of library statistics survey, the collection interface, and discuss the implications of these changes. The workshop presentation and discussions will cover: Learning activity: Participants will have an opportunity to share their experiences completing the new statistics questionnaire, engage in discussions on data collection workflows and timing, and help better define and expand the definitions for more consistency in reporting.
Joshua J. Herter (University of Winnipeg) Annual reporting methods can be frustrating. Even once you decide what to include or produce, they often require a dubious amount of work given their short lifespan, limited reach, and inscrutable impact. Conveying our contributions to teaching and research is an important task, but does the production of a curated marketing package really constitute engagement with campus stakeholders? Does anyone actually read your data dashboard? If they do, is it worth the risk that they misinterpret the information? This is the tale of one Assessment Librarian’s struggle to develop meaningful and genuine reporting tools that resonate with the administrative team’s use cases. After years of failed attempts, we accidentally landed on a strategy that works – library data stories. Rather than produce a static artifact – that few ever read – we focus on maintaining bite-sized narratives that frame what we’re doing rather than what we did. Based around a focused issue, project, or resource, these narratives are regularly updated and evolve over time. They serve two main functions: 1) To keep library administration equipped with live, actionable data to better understand the trajectory of our activities, and 2) Maintain a reliable set of “have-ready” storytelling assets. We find that this method facilitates the transfer of knowledge between library department heads, library administration and campus stakeholders, and better positions the library as an active member of the campus community. Learning Objectives:
Tara Stephens-Kyte (University of British Columbia) How an institutional repository communicates its value to key stakeholders should align with the interests and goals of its intended audience. But what does that look like in practice? And what if the audience isn’t really interested? Drawing on the experience of cIRcle, the institutional repository at the University of British Columbia, I will break down the elements of our annual Impact and Activity Report and its key metrics to demonstrate how we combine quantitative and qualitative data to shape narratives about our contributions to supporting teaching and research at UBC and our role in increasing its visibility. I will discuss the value of reflecting on, celebrating, and openly sharing our successes and key partnerships in a rapidly evolving cultural and scholarly landscape, despite receiving no direct feedback or measurable engagement from our intended audiences of faculty, students, and librarians. I will provide an overview of the familiar tools we use (such as Google Analytics and Tableau) and discuss how the scope and focus of the report continually shifts as we define and redefine internal performance indicators, benchmarks, and measures of success. In the spirit of the H.G. Parry quote, “Something has to be heard to be ignored,” I place the cIRcle report in the broader context of long-term advocacy goals in the continued shift toward open scholarship. Attendees will leave the session with examples of adaptable approaches to conceptualizing and articulating the value of an IR and practical strategies for aligning assessment reporting practices with outreach, advocacy, and strategic planning goals.
Amy McLay Paterson (Thompson Rivers University) A process map is a diagrammed workflow created collaboratively, usually by all members of a team involved in the process.
Ornat and Moorefield (2018) state that by “depicting a complex process (including its actors and stakeholders), maps can help pass down, share, or communicate institutional knowledge. They are particularly valuable in aiding cross functional collaboration between different library departments or units.” Benefits of process mapping include: This proposed workshop will provide an introduction to process mapping, including what process mapping is, its potential benefits to participants and their library, and where it might be used. The workshop will include time for attendees to try their hands at participating in a process mapping session. Participants are expected to be new to process mapping, with no prior experience required. Learning objectives: Participants in this session will be able to: Description of the learning activity: Participants will divide into small groups, and each group will be given a simple process presumed to be common in all libraries (e.g., checking out a book). One person in each group will play the role of facilitator. In a real-world situation, all participants would be members of the same library, yet even within the same team it is common for members to have slightly different understandings of the steps and order of simple processes. For the purpose of the activity, each participant will be given a role to play (e.g., librarian, library technician), as well as a short description of the way this process looks from their role’s point of view. The goal of the activity will be for all participants on the team to work together to create a map of the process by incorporating the viewpoints and roles of all participating team members. This will include keeping to the scope of the process, determining the process start and end, defining any ways the process might branch off, and smoothing out any misunderstandings between team members.
At the end of the activity, teams will be invited to share the maps that they made with the larger group and to discuss any significant successes or challenges experienced during the activity.
Gordon Coleman (Simon Fraser University Library) This presentation will discuss a project to re-envision assessment activities in the Resource Acquisition, Management, and Metadata (RAMM) division, the technical services unit of Simon Fraser University Library. The standard “monthly report” of statistics had not been updated in many years. It was clear the report was counting no-longer-important functions and ignoring important new categories of activity. This session will explore the process by which assessment was re-envisioned. Stages included: identification of goals, benefits, and principles; design of new reports via Alma Analytics and other tools; iterative engagement with stakeholders to confirm their needs were being met; and design of new communication tools to share the results of assessment work. Through the session, participants will be challenged to consider many typical assessment problems as they manifest within the technical services context. What is the right balance between quantitative vs. narrative and qualitative measures? Between routine stats vs. ad hoc project assessment? Is it true that “if you don’t count it, it doesn’t count”? How do we move from gathering statistics to assessing how our work supports unit and library goals? How can assessment be a driver of change? The ultimate goal of the project is to ensure we can convey the value of tech services work to our library colleagues, bring hidden work to light, and support the case for more resources and tools.
8:00 AM - 8:50 AM – Breakfast and Registration
8:50 AM - 9:00 AM – Welcome from CLAW Program Committee and Western Libraries
9:00 AM - 11:00 AM – CARL Library Impact Framework & Logic Models
11:00 AM - 11:20 AM – Break
11:20 AM - 11:40 AM – Oh, the places we could go!: Our Journey towards an Analytics Roadmap
11:40 AM - 12:00 PM – Piloting a Beginning College Assessment of Information Literacy for the Foundational Studies Program
12:00 PM - 1:00 PM – Lunch
1:00 PM - 2:00 PM – Using count regression models to analyze library usage: A practical overview
2:00 PM - 2:20 PM – Assessment in Open Access Publishing: A Focus on APCs
2:20 PM - 2:40 PM – Break
2:40 PM - 3:00 PM – Usage Statistics in Times of Technological Disruption: Case Studies of Changes in Interface Design and Usage Patterns at Scholars Portal
3:00 PM - 3:20 PM – It was bound to happen: a print journals deaccessioning project
3:20 PM - 3:40 PM – ‘Big Deal’ Journal Package Assessment using Unsub at the University of Alberta
3:40 PM - 4:00 PM – Uncovering Impact: A Bibliometric Approach to Assessing Library Collections
4:30 PM – Reception
Tuesday, October 24
8:00 AM - 9:00 AM – Breakfast
9:00 AM - 10:00 AM – Keynote: Unlocking Value: Innovative Approaches for Measuring the Impact of Public Libraries
10:00 AM - 10:20 AM – Charting a Course: How Assessment Can Help your Library Achieve its Strategic Goals
10:20 AM - 10:40 AM – Break
10:40 AM - 11:40 AM – Research Library Impact Framework (RLIF) initiative
11:40 AM - 12:00 PM – The Library as Place - The role of library spaces on student experience in a post-COVID world
12:00 PM - 1:00 PM – Lunch
1:00 PM - 1:20 PM – Library Instruction Statistics: Let me count the ways ….
1:20 PM - 1:40 PM – Finding a Home for Assessment
1:40 PM - 2:00 PM – Break
2:00 PM - 3:30 PM – Practical Workshop: Analyzing Library Data with jamovi
Wednesday, October 25
8:00 AM - 9:00 AM – Breakfast
9:00 AM - 10:00 AM – Annual Collection of Library Statistics: Changes to the CARL Survey and Survey Tool
10:00 AM - 10:20 AM – “Letting Go of the Annual Report: Iterative Data Storytelling at the UWinnipeg Library”
10:20 AM - 10:40 AM – Break
10:40 AM - 11:00 AM – “Something has to be heard to be ignored”: Reporting and reception of institutional repository impact metrics
11:00 AM - 12:00 PM – Process mapping for fun and profit* (or Intro to Process Mapping) *no actual profit
12:00 PM - 1:00 PM – Lunch
1:00 PM - 1:20 PM – Re-envisioning Assessment in a Technical Services Unit
1:20 PM - 2:50 PM – Unconference