Workshop Descriptions

Quick Links

Preconference Workshops
Keynote Address
1 Hour Workshops
Quick Classes
Posters

Preconference Workshops

How to Use APIs and Data Science to Support Your Collections Work – Roger Reka (University of Windsor)

Librarians use data to support the decisions they make in developing their collections. Knowing where researchers publish, whom they cite, and other similar metrics helps us identify which materials to purchase and which not to, and can give us a sense of what topics our researchers are studying. These data are readily available in databases that many of us subscribe to, such as the Scopus and Web of Science citation indexes, but we are not yet making good use of these resources to collect these data en masse and work with them efficiently. Large global information companies, such as Elsevier, recognize the value of this type of information for library collections development and institutional research analytics. Elsevier recently purchased Montreal-based 1Science, which provides research analytics services for universities and libraries, and has been further developing and promoting its SciVal analytics platform. The analytics provided in these reports rely on the same data sourced from the citation indexes that we subscribe to, yet libraries often don’t have the skill set to produce these metrics on their own. Many of the databases and tools that libraries subscribe to offer APIs that provide machine access to the data, as opposed to the graphical user interface. APIs are accessed via scripts written in programming languages, automating the work that we would traditionally do through the database interface. Learning how to access APIs with accessible programming techniques will build the data science skills of librarians, skills which can be transferred to other areas of work.

This workshop will start with a brief overview of a bibliometrics approach, building on the information presented by Vincent Larivière at previous CLAW workshops. Participants will discuss what data we should use to support our collections decisions, why, and what those data can be used for. We will discuss what data sources are available to us, and their benefits and limitations. In a hands-on portion led by the facilitator using Jupyter Notebooks, participants will be introduced to APIs and will use Python to query data through the Scopus/Web of Science API. Using the downloaded data, participants will learn how to run descriptive statistics using the pandas module in Python and to generate data visualizations that can be used in library reports and collections assessments. Participants will see how they can use Python programs to integrate with library systems, such as a link resolver, to automate their work. The workshop will end with references to micro-courses where participants can continue to learn the fundamentals of data science and programming. Learning outcomes:

  • Recognize different types of data sources for, and bibliometric approaches to, collections assessment
  • Explain what APIs are and when to use them over graphical user interfaces
  • Use Python with Jupyter Notebooks to prepare descriptive statistics of the data
  • Describe how to integrate Python with existing library systems, such as a link resolver

Due to the hands-on nature of this workshop, participants must bring a laptop with the Mac, Linux, or Windows operating system (not a tablet, Chromebook, etc.) on which they have administrative privileges. Participants must also pre-install the free Anaconda distribution (https://www.anaconda.com/distribution/) before the workshop. Please let Roger Reka know if you have any questions or concerns about these requirements, or if you are unable to bring a laptop.
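To give a concrete sense of the workflow before the session, here is a minimal sketch of querying a bibliographic API and loading the results into pandas for descriptive statistics. The endpoint, parameters, and field names are illustrative assumptions rather than the workshop's actual code; consult your vendor's API documentation for the real details.

    # Illustrative sketch only: endpoint, parameters, and field names are
    # placeholders, not a real vendor API or the workshop's own material.
    import requests
    import pandas as pd

    API_KEY = "YOUR-API-KEY"                      # issued by the database vendor
    ENDPOINT = "https://api.example.com/search"   # hypothetical search endpoint

    # Request publication records for a hypothetical institution
    response = requests.get(
        ENDPOINT,
        params={"query": "AFFIL(University of Example)", "count": 25},
        headers={"X-API-Key": API_KEY},
    )
    response.raise_for_status()
    records = response.json().get("results", [])  # assumed response structure

    # Load the records into a DataFrame and produce simple descriptive statistics
    df = pd.DataFrame(records)
    print(df.describe(include="all"))             # counts, uniques, top values
    if "journal" in df.columns:                   # 'journal' is an assumed field
        print(df["journal"].value_counts().head(10))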

What counts and what can be counted – Fundamentals of electronic resources assessment – Klara Maidenberg (University of Toronto) & Eva Jurczyk (University of Toronto)

This workshop will expose participants to best practices for making acquisition, renewal, and cancellation decisions involving electronic resources.

As electronic resources claim a growing proportion of academic libraries’ collections budgets, librarians outside the electronic resources team are increasingly being asked to evaluate and make decisions about these resources. The skills and approaches required for evaluating electronic content are different from those used with print collections, and there is a scarcity of professional development opportunities in this area. Where expertise in assessing collections exists, it is often limited to a small number of expert staff. The goal of this workshop is to enhance the capacity and confidence of librarians by providing practical tools and approaches that they can adopt as they engage in decision-making around electronic collections. Attendees will develop their capacity to collect and analyze data related to their electronic resources. Participants will learn about qualitative and quantitative methods of electronic resource assessment and will come to understand the concepts and metrics frequently used to select, manage, and evaluate electronic resources, including sources of evidence, the structure of content packages, and how that structure may affect collections decisions.

Participants will be presented with case studies and real data to assess electronic resources and will have an opportunity to apply their learning to hands-on assessment exercises. Participants will take away practical skills and tools that they can put to use in their own work environments. This workshop will be of interest to librarians engaged in collection development or assessment work.

Back to Top

 

Keynote Address

Through Tensions: Critical Conversations on Positionality and Power in Library Assessment – Ebony Magnus (Simon Fraser University), Maggie Faber (University of Washington), & Jackie Belanger (University of Washington)

In her keynote address at the 2017 Canadian Library Assessment Workshop, Karen Nicholson posed the question, “how might we engage critically with quality assurance and assessment to better align them with our professional values and the academic mission of the university?” (p. 3). With this session, we hope to open this question, among others, for consideration by attendees at CLAW 2019. We will invite attendees to engage reflectively and critically in nuanced discussions about the nature of power, bias, and positionality in library assessment work. Over the last two years, we have conducted research on critical methodologies employed in the social sciences, data studies, and educational research, culminating in the publication of the article “Towards a Critical Assessment Practice” in In the Library with the Lead Pipe. In that work, we shared the professional and personal experiences that led us to explore structures of power inherent in our assessment work, and we posed a number of questions with which we have grappled, including:

  • How do our own identities, institutional positions, and perspectives shape our work?
  • What is the purpose of the assessment, who decides what to assess, and who benefits from the work?
  • Are there elements of our institutional contexts (e.g., an emphasis on a culture of accountability) that create tension with the values we try to bring to our work? How might a more critical approach transform these approaches to assessment?
  • What are the histories and contexts of the methods we choose, and how do these shape our work? How can we take account of the histories and inequities of qualitative methods such as ethnography, even while these methods are often posited as an antidote to an overemphasis on quantitative assessment?
  • What is considered “evidence” and who decides?
  • Are we working in ways that enable power sharing and engagement with user communities at all stages of the process, from question formulation and data analysis to decision-making?

These are topics and questions with which we continue to grapple and which we think warrant deeper engagement from the library assessment community itself.

This session aims to expand the current discussion of assessment in order to recognize and more effectively address issues of power, inclusion, equity, and inequality in various aspects of our practices. We imagine that many assessment practitioners have experienced conflict between institutional priorities, administrative expectations, student experiences, and methodological integrity. The presenters hope that this session will spark discussions about how assessment practitioners can engage meaningfully with the potential tensions in this work. With this in mind, we would like to propose a modified session format (60-90 min) in which we draw on the tensions we have uncovered in our own work to facilitate a dialogue with conference attendees. We will describe the sites of greatest tension in our daily work: the practices in which systemic influence has become most apparent, yet cannot be entirely undone. We do not seek to offer packaged solutions, but will explore ways in which librarians might begin to interrogate bias and power in our assessment activities. We recently presented a similar session at CAPAL 2019, framing these questions for an audience largely made up of librarians engaged in critical librarianship, though we have not yet had the opportunity to engage our assessment peers in a similarly open discussion. It is our aim, with this facilitated dialogue, to:

  • Enable assessment practitioners to reflect on their own positionality and institutional context and the ways in which these shape their work throughout the assessment cycle
  • Examine underlying assumptions and power structures in current assessment practices
  • Explore other disciplines and alternative methodologies in order to critically consider ways of engaging user communities in assessment work

To accomplish this, we will structure the session to include sections of presented content, drawing from our research, coupled with guided discussion activities. Activities will include small- and large-group discussions, and we may use a range of tools to facilitate sharing out, including post-it notes, poster paper, and shared Google documents. We found this approach led to productive and collaborative dialogue during our session at CAPAL 2019, while allowing for some documentation of the topics discussed.

Back to Top

 

1 Hour Workshops

Beyond Metrics: Ethnography Lite for Librarians – David Michels (Dalhousie University)

Libraries measure things – collections, transactions, visits, downloads, and likes. New tools allow us to mine and visualize that data to demonstrate the impacts of our activities. We know that good metrics are important for planning, budgeting, and programming. But numbers need context, and data are most meaningful when connected to the stories of our clients, the people we serve. Since we regularly interact with our clients, we can mistakenly assume we know their stories. We might be surprised by the stories they would tell if we actually asked. The goal of this session is to challenge and equip library staff to actively seek out those untold stories. In this interactive and hands-on session, we will explore ethnographic field methods that librarians can use to uncover the stories behind the numbers. Following a brief introduction to ethnography as a research methodology, we will do three things in this session:

  1. We will explore together our ethnographic toolkit – participant observations, focus groups, interviews, and writing projects;
  2. We will try out our tools with demonstrations and practice activities;
  3. We will discuss how we can empower our clients to share their library, research, and information stories.

As we proceed through each step, there will be opportunities to learn about important books and resources that would be helpful for research planning. We will consider several examples of good ethnographic research projects particular to libraries and information seeking. We will reflect on real-life research stories from my own research, and the successes, failures, and lessons I have learned along the way. Participants will be given the opportunity to talk about their own context and opportunities within a small discussion group. At the end of this session, participants should be able to identify a potential research opportunity, understand the toolkit available to explore that opportunity, and feel inspired to pursue that research.

Choosing Your Assessment Method – Emily Christofides (University of Waterloo)

In assessing and improving library space and services, it can sometimes be difficult to know where to begin. While there are always many questions that could be asked, it is difficult to know how to go about gathering data to answer them. Specifically, how do you choose what methods to use to gather the information you need in order to move forward with decision making? This session will help you get started in planning your next assessment by describing some of the similarities and differences between common methods, explaining how to formulate your question to fit the appropriate method (or better yet, how to choose your method based on your question), and providing an opportunity to try doing so in a supported environment. Do you:

  • Have a tried and true method (for example, surveys) that you are comfortable using but are not sure whether it can be applied to your current problem?
  • Know what you want to assess but are not certain how to go about it?
  • Have a regular assessment that you run without really knowing what you intend to do with the information?

If any of these situations apply to you, then you could benefit from understanding more about how to choose the best method for the question you want to answer. In this session, the facilitator will walk you through some of the key assessment methods used in library UX work, with examples of how these methods have been used in a library setting. Methods covered include surveys, focus groups, interviews, usability testing, and observational methods. You will learn what each method involves (at a high level), the kind of data that you can gather, and the types of questions it is suitable for answering, and hear about the advantages and disadvantages of each. We will also explore how the questions you ask impact the kinds of decisions you can make. In a hands-on activity, we will experiment with the match between research questions and methods to help you make better decisions when choosing an assessment method. Participants will be asked to come prepared to discuss one of their library assessment-related questions. This is a question that they are interested in answering about their users (not a research question). In small groups, participants will work through their questions, including aspects such as: Who are their users in this case? What do they want to know about them? What do they hope to do with that information? How will the information gathered impact decision-making? Participants will then be matched with a specific method to explore how the method chosen either limits or enhances the kind of information that can be gathered. As a group we will then discuss ways of better matching the methods with the questions that workshop participants want to answer. The session will have both informal lecture components and a hands-on activity component, with activity topics selected from participants’ own experience. Learning objectives include a basic understanding of the different assessment methods available, knowledge of how the question participants seek to answer is affected by the method chosen, and practical experience matching methods and questions in order to gather actionable insights.

Evaluating and Managing the Implementation of Your Strategic Plan – Maurini Strub (University of Rochester) & Lauren Di Monte (University of Rochester)

Organizations invest a great deal of resources in developing a strategic plan, only for it frequently to land on a shelf or in a file cabinet, or to be electronically archived. Using a case study analysis, we will take a look at best practices for implementing and managing a strategic plan. At our institution we applied an outcomes-based assessment framework to create a required (but flexible) structure that has driven the projects that advance strategic goals. This framework is also present in project planning documents, and is used to increase buy-in by creating a shared understanding of scope and success criteria. Finally, by establishing an assessment and communication plan for the implementation, we keep the strategic plan at the forefront of everyone’s minds through regular reflection, evaluation, and communication. In this session, we will also address the tools utilized, challenges encountered, and some of the ways organizational resistance presented itself during the process. Using a case study analysis, attendees will learn how to:

  • Construct a structure for implementing a strategic plan
  • Develop a shared vocabulary for implementation
  • Manage perceptions of operational vs. strategic work
  • Identify and realign cultural mismatches
  • Recognize and manage push-back
  • Develop communication strategies
  • Manage and mitigate perceptions of failure
  • Build accountability into an implementation plan.

It’s the Circle of Life: Introducing Ecocycle Planning – Chloe Riley (Simon Fraser University)

This session will introduce Ecocycle Planning as a qualitative method for collaboratively assessing a collection or portfolio of activities in order to understand the work as a whole, and to pinpoint blockages and opportunities for renewal. Ecocycle Planning is one of a collection of facilitation techniques called Liberating Structures, which are designed to be inclusive and to disrupt conventional or stale practices of working in groups. Liberating Structures can be employed in any situation that involves people working together, and many of them can be used or adapted for everyday evaluation and assessment practices. In Ecocycle Planning, a team or group works together to identify their work activities, projects, and initiatives, and position them within the ecocycle (birth, maturity, creative destruction, renewal). The exercise facilitates the team’s explorations of how to balance activities, set priorities, and identify opportunities for freeing up resources. The structure enables participation from every member of the team, and lets group members see their own work in the context of the team as a whole. In this workshop, participants will learn the essential design elements that make Liberating Structures successful. They will also understand how to facilitate Ecocycle Planning to assess a team’s activities and programs, and to develop strategies for using it with their working groups, teams, units, and collaborators.

Obtuse, Acute, and Right: A Workshop on the Unexpected Benefits of Teaching Triangles – Christopher Popovich (University of Guelph)

Professional development (PD) and assessment are two critical features of academic libraries that require careful consideration and often considerable resources. Combining these two elements of professional practice offers a range of benefits beyond the laudable goals of developing more effective educators and better understanding our professional practice. Teaching Triangles, or Teaching Squares as they are more commonly described, consist of observing the teaching of one’s peers through the lens of reflective practice. By combining PD and assessment in a self-reflective, collaborative program, librarians and library professionals have opportunities for growth through formative self-assessment, in terms of greater self-awareness and thoughtfulness in both observing and teaching. There are also summative benefits for the librarian through the Square Share, where the observer formalizes and shares the process and self-assessment outcomes of the observation and self-reflection. The in-service implementation and uncritical format of the triads (or squares) offer shared experiences that promote team building and foster thoughtful engagement and pedagogical discussions throughout the semester. It is also relatively simple to implement, inexpensive to run, and offers economies of scale when expanding the program beyond the unit or library. The goal of Teaching Triangles is not to assess or critically evaluate the teaching but rather to use the observation as a font of new ideas and approaches and to stimulate reflective self-assessment and growth within the observer. Teaching Triangles generally follow five steps:

  1. A kick-off, where the participants get together in an orientation/info session and agree on guidelines and roles;
  2. A scheduling session to organize the teaching and observation rota;
  3. A pre-observation meeting to exchange course material, outline learning outcomes, discuss the relevant students, faculty, etc., and reaffirm the self-reflective purpose of the observation;
  4. The observation and written self-reflection;
  5. A debrief (Square Share) which explores: What did I take away as a teacher? What would I like to try? Would this approach/format work in my context? How will the experience inform my practice?

The benefits of teaching triangles are:

  • Light, quick, and inexpensive (resource-wise)
  • Not traditional quantitative assessment, in that it is predicated on critical reflective practice and exposure to alternative approaches and teaching styles
  • A trust-building exercise in an atmosphere of healthy skepticism toward the neoliberal institution, with which assessment can be complicit.

The unexpected benefits of teaching triangles are:

  • Cross-pollination of different fields with different student needs and different styles and levels of engagement
  • Librarians are exposed to new teaching techniques, tips, and tricks
  • Working out the parallels, confluences, and divergences in IL practice
  • Gaining a holistic view of teaching in the library and beyond

Learning Objectives:

  1. By the end of the workshop, attendees will be able to identify and discuss the elements of a Teaching Triangles self-assessment program
  2. By the end of the workshop, attendees will be able to implement the Teaching Triangles self-assessment toolkit
  3. By the end of the workshop, attendees will be able to adapt and design a Teaching Triangles self-assessment program for their institution.

Project Outcome for Academic Libraries: Data for Impact and Improvement – Greg Davis (Iowa State University)

Attendees will learn about the new Project Outcome for Academic Libraries surveys and resources. Project Outcome is a free toolkit that helps libraries measure four key learning outcomes – knowledge, confidence, application, and awareness – across seven library program and service areas. The survey topics cover Instruction, Events/Programs, Research, Teaching Support, Digital & Special Collections, Space, and Library Technology. The toolkit provides academic libraries of any size the means to easily measure outcomes and use that data as the basis for improvements and advocacy. This session will include opportunities for questions and discussion among participants. Optional: prior to the workshop, we recommend that participants register for Project Outcome, review basic materials in the toolkit, and consider a goal for outcome measurement at their library. Learning outcomes – participants will:

  • Discover how Project Outcome can help academic libraries measure meaningful learning outcomes.
  • Learn how to use the Project Outcome for Academic Libraries toolkit, from administering surveys to visualizing results.
  • Understand how other libraries have used outcome data for action.
  • Discuss how to put data to work in improving library services and advocacy.

Back to Top

 

Quick Classes

Can UX and Assessment Work Together? – Juliene McLaughlin (University of Guelph)

This is the short story of how two UX librarians sought help to make UX work resonate more with colleagues. We interviewed 16 UX librarians from across North America for answers and stumbled across fascinating and wildly differing relationships between UX and Assessment. Of course, several natural connections between UX and Assessment were highlighted, but we also heard of many tensions that cloud the picture. First, we will answer the question: do UX and Assessment make sense together? Then we will share results from our interviews that describe the factors that contribute to the acceptance of Assessment and UX results, the research methods that seem to be more trustworthy (and why that is), and the interplay between UX, Assessment, and organizational culture and structure. We will then explore practical strategies to incorporate UX methods into an assessment program. Attendees will leave the presentation with:

  • An understanding of how UX and Assessment fit together
  • Examples of how various academic libraries have arranged UX and Assessment on the organizational chart
  • Practical ways to integrate UX into an assessment program.

COUNTER 5 Release Reports: A Provider Perspective – Sabina Pagotto (Scholars Portal)

Collections and e-resource departments at many academic libraries rely on COUNTER usage reports to understand the value their electronic subscriptions provide to library users. Release 5 of the COUNTER Code of Practice, which went into effect in early 2019, represents a significant change in the standard. While the new release is an important step towards normalizing usage data across different content types, it requires a major shift, both technologically and conceptually. Libraries and content providers alike face a steep learning curve as they adjust to new reports, new metrics, and new vocabulary. Scholars Portal, the service arm of the Ontario Council of University Libraries, provides COUNTER-certified usage reports for our locally hosted Journals platform and is working to create COUNTER reports for the recently upgraded Scholars Portal Books platform. The process of reviewing and modifying the way our usage logs are transformed into standard views has left Scholars Portal staff with a deep understanding of COUNTER title reporting. This session will explain the concepts behind the major changes in COUNTER 5, describe the new COUNTER standard views for e-book and e-journal titles and how they differ from the previous release’s reports, and finally demonstrate how usage log data is transformed to produce these standardized reports. By the end of the session, participants will:

  • Recognize the concepts behind COUNTER Release 5.
  • Understand the new metrics and report views, and how they can be used to analyze usage data.
  • Gain a deeper awareness of how usage data is generated and what technical limitations are possible.
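As a rough illustration of the log-to-report transformation described above, the following sketch rolls a simplified usage log up into COUNTER-5-style Total_Item_Requests and Unique_Item_Requests per title and month. The metric names follow Release 5 terminology, but the log schema and code are assumptions for illustration, not Scholars Portal's production pipeline.

    # Hedged sketch: one row per request event, aggregated per title and month.
    # Column names are assumed for illustration.
    import pandas as pd

    log = pd.DataFrame({
        "title":   ["Journal A", "Journal A", "Journal A", "Journal B"],
        "session": ["s1", "s1", "s2", "s3"],          # user session identifier
        "date":    ["2019-01-03", "2019-01-03", "2019-01-15", "2019-02-02"],
    })
    log["month"] = pd.to_datetime(log["date"]).dt.to_period("M")

    report = log.groupby(["title", "month"]).agg(
        Total_Item_Requests=("session", "size"),      # every request event counts
        Unique_Item_Requests=("session", "nunique"),  # at most one per session
    ).reset_index()
    print(report)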

Evolution of Big Deal Analysis – Jaclyn McLean (University of Saskatchewan) & Ken Ladd (University of Saskatchewan)

We first started our assessment of Big Deals in 2015. Since then we have updated and expanded our analysis of the data. From simple usage data and aggregate cost analysis in 2015 to detailed cost per title, breakdown of packages by college or discipline, and the addition of citation data from Web of Science, our program has developed and expanded over the years. We’ve also worked on simplifying and presenting the data and the story about the data to our liaison librarians, and this year have the “opportunity” to test our methodology with a large cancellation project, using our collected data as the foundation. We will describe how we aggregate data from different sources, provide tips and tricks for data management we’ve learned along the way, share the template we use to summarize the data about each Big Deal, and discuss the time investment required to complete these analyses. We will discuss some challenges we encountered through our analysis, and opportunities for further assessment (e.g., incorporating information about APCs paid to the publisher). Our assessment project has been informed by and continues to develop based on the experiences shared by others—by sharing practical information about how we started our assessment of big deals, we hope to help others get started, or adopt portions of the analysis that may be useful in their own contexts. We will also discuss our assessment plan looking forward to 2020 and how we plan to change what we’re doing in light of COUNTER 5 implementation.
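As a simple illustration of the kind of aggregation described here, the sketch below joins a usage export to a title-level price list and computes cost per use; the file names and columns are invented for the example and do not reflect the presenters' actual template.

    # Hedged sketch: joining usage and cost data to compute cost per use by title.
    # File names and column names are assumptions for illustration only.
    import pandas as pd

    usage = pd.read_csv("package_usage.csv")        # assumed columns: title, uses
    costs = pd.read_csv("package_title_costs.csv")  # assumed columns: title, cost

    merged = usage.merge(costs, on="title", how="outer")
    # Avoid dividing by zero; titles with no recorded use get a blank cost per use
    merged["cost_per_use"] = merged["cost"] / merged["uses"].where(merged["uses"] > 0)

    # Titles with the least favourable cost per use float to the top for review
    print(merged.sort_values("cost_per_use", ascending=False).head(20))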

I Meant to Get It Back On Time! Countering Bias, Promoting Equality, and Improving Customer Service Through Qualitative Fine Data Analysis – Lisa Levesque (Ryerson University) & Kelly Kimberley (Ryerson University)

This workshop will describe a qualitative study undertaken at Ryerson University Library to understand why patrons incur fines. The better we understand our patrons, the better service we can provide, and this workshop will describe how the study fits into the revision of the fine structure at Ryerson University Library to make it more equitable and improve customer service interactions. Participants at this workshop will leave with a roadmap for how to explore fine data. They will learn about methods that reduce bias in their qualitative studies. They will learn about evidence that can be used to advocate for change to fine structures at their academic library, and why change is necessary.

This qualitative study will be completed this summer. We anticipate that the results will confirm what we already know about fines. Through experience, we know that fines create negative customer service interactions. Enforcing them is unpleasant both for staff, who have to bear the brunt of patrons’ negative emotions, and for patrons, who now have an unpleasant memory associated with the library. In reviewing the literature, we found that fines disproportionately affect those who are least able to pay them, making them inequitable and a barrier to library use. Fines also run contrary to other library services that aim to increase access to materials. This qualitative study builds on the Fall 2018 quantitative analysis of fine data that was conducted to determine what types of fines were being incurred and the costs associated with them. As a result of that analysis, the decision was made to change the library fine structure to eliminate fines for overdue monographs free of holds or other restrictions. Running a complementary qualitative study will be instrumental in confirming this course of action.

Because of the high emotions they can cause, fines are a controversial topic. Librarians and library staff often have misconceptions about the effectiveness of fines and the reasons that patrons incur them. This study gives patrons a voice to explain why they incurred fines, allowing library workers to overcome our assumptions. To reduce and keep our own biases in check, including those derived from prior research, we used grounded theory and critical assessment best practices when analyzing results. These included collaborative coding and the use of the constant comparative method, both of which mitigate bias. We also included student feedback during the analysis process in order to further privilege patron voices. These methods ensure that our study results are grounded in evidence. While bias can never be totally eliminated, it can be controlled, reduced, and consciously accounted for.

This study was undertaken using tools that will be available at most academic libraries. Borrowing and Lending Services maintains an online Library Fines Appeals form, and the use of this pre-existing data eliminated the need for time-consuming data collection. The data were coded using Google Sheets and a shared Google document codebook, with further analysis undertaken using a licensed Tableau subscription, software that is also available in a free public version. The use of pre-existing data and free, commonly used tools allowed for less time spent on data collection or on learning software and more time to critically examine results. In-person conversations formed an important part of the analysis process, allowing for the development of a shared understanding of emergent results.
By describing the approach taken by Ryerson University Library to explore qualitative fine data, this workshop will cover methods to reduce bias in research and connect study results to service changes that improve customer service and reduce bias.

Once Upon a Research Consultation: Using Consultation Statistics to Tell Stories About Relationship Building, Workload, and Organizational Change – Amy McLay Paterson (Thompson Rivers University)

As the primary site of librarian-student interactions moves away from the central reference or service desk, research consultations become a vital benchmark for showcasing relationship building and tracking organizational change. However, many libraries track research consultations only as basic numbers, and some only as part of service desk statistics. In 2017, Thompson Rivers University Library simplified our service desk statistics form but separated out and expanded our research consultation tracking procedures, adding metrics for liaison area, referral method, and amount of time spent (including prep time), among others. Since that time, the Library has used this information as a group to assess workload distribution and to track and respond to changes in our service model. As individual librarians, we have used the information to fine-tune our relationships and communications with liaison areas and to articulate our unique impact for tenure dossiers and annual reports. This simple change has led to a greater understanding of the current shape of our work, but it has also prompted us to contemplate bigger questions about how the nature of our work and our relationships with students are evolving. Participants in this session will:

  • Identify and discuss factors that distinguish a research consultation from a reference interaction and explain the value of tracking consultations separately
  • Discuss how/if they are currently using their research consultation stats and contemplate how they would like to use them
  • Explore several metrics (in addition to numbers) that can be tracked in regard to research consultations and articulate the benefits of each
  • Recognize advantages and disadvantages of several access models for consultation numbers and identify factors that need to be considered when mediating access.

Should I Stay or Should I Go? Updating Journal Value Analytics to Go Beyond Cost Per Use – Jason Friedman (CRKN) & Émilie Lavallée-Funston (CRKN)

Compiled annually by the Canadian Research Knowledge Network (CRKN), the Journal Value Analytics (JVA) tool combines pricing, subject, and usage data to provide an information resource for members evaluating their participation in CRKN-licensed journal packages. Unique to the CRKN JVA design are three features we intend to highlight in this session: the inclusion and separate identification of open access and paid content data, the inclusion of year-of-publication data, and the calculation of estimated costs per title. This quick class will share CRKN’s evolving JVA design, focusing on the key, unique features that provide valuable insight for members. The flexible design allows members to give feedback on their specific institutional needs and enables a more refined analysis in each subsequent year. While the 2019 JVA uses COUNTER 4 data, we will also describe how we intend to use COUNTER Release 5 data in the 2020 JVA. We hope participants will take away inspiration, tricks, and tools for performing analysis at their own institutions or across multiple institutions. Given the scope of completing this analysis for 75 institutions annually, we will discuss tactics used to streamline the process, ensure the reliability of data, and validate the accuracy of analyses. In addition, we would like to engage with workshop participants on the design of the project: answering questions on design or analysis methods, looking ahead to COUNTER 5 data, and discussing other elements of this project.
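To make the open access split concrete, the sketch below separates OA from paid usage and derives an estimated cost per title by allocating the package price in proportion to list price. The column names, data, and allocation rule are assumptions for illustration, not CRKN's actual JVA methodology.

    # Hedged sketch: OA vs. paid usage and an estimated per-title cost.
    # Data, column names, and the list-price allocation rule are invented.
    import pandas as pd

    titles = pd.DataFrame({
        "title":      ["Journal A", "Journal B", "Journal C"],
        "list_price": [900.0, 600.0, 300.0],   # invented list prices
        "total_uses": [500, 200, 100],
        "oa_uses":    [150, 200, 10],          # uses of open access content
    })
    package_cost = 1200.0                      # invented negotiated package price

    # Allocate the package price across titles in proportion to list price,
    # then compare the estimate against paid (non-OA) usage for each title.
    titles["estimated_cost"] = package_cost * titles["list_price"] / titles["list_price"].sum()
    titles["paid_uses"] = titles["total_uses"] - titles["oa_uses"]
    titles["cost_per_paid_use"] = (
        titles["estimated_cost"] / titles["paid_uses"].where(titles["paid_uses"] > 0)
    )
    print(titles[["title", "estimated_cost", "paid_uses", "cost_per_paid_use"]])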

To Be In Sync or Out of Sync: Considerations for Switching from LibQual to the Insync Survey – Linda Bedwell (Dalhousie University) & Laura Newton Miller (Carleton University)

Trying to figure out the needs of our users is a never-ending quest for academic libraries. In 2018-19, two Canadian university libraries made the decision to administer the Australian Insync survey rather than LibQual. The main purpose of the Insync survey is to give students, staff, and faculty the opportunity to state how well they believe the Library performs in relation to what they think is important. The survey focuses on performance and importance specifically related to communication, service delivery, facilities & equipment, information resources, and overall satisfaction. The strength of Insync’s more streamlined question structure outweighed the possible disadvantage of losing longitudinal and benchmarking data. Now that Carleton and Dalhousie are well into interpreting results reports and analyzing raw data, they are fully realizing the challenges and benefits of conducting the Insync survey. The experience of these two institutions should help guide other institutions who are either just starting to think about conducting a survey or are considering “making the switch” from LibQual to Insync. Learning Outcomes:

  • Attendees will acquire knowledge of the content and structure of the Insync Library Client Survey and will be able to apply a list of pros and cons of switching to Insync from LibQual for their own institution.
  • By learning from peer experience, they will be able to anticipate the amount of work involved with interpreting reports and analyzing raw data as well as the sorts of findings and potential follow-up actions that arise from the results.
  • These outcomes will enable attendees to make a recommendation to switch or not to switch to the Insync survey or other alternatives at their own academic libraries.

Who’s Afraid of the W Word: Tackling a Weeding Project – Sarah Simpkin (University of Ottawa) & Ingrid Moisil (University of Ottawa)

Traditionally, academic libraries collected materials without giving much thought to weeding. As collections grew bigger, less-used collections were moved to compact shelving, then to storage facilities, often off-site and sometimes shared. In contrast, weeding in public libraries has always been integrated into the collection development process. In recent years, academic libraries have started weeding some of their collections, and dozens of articles have been published on the deselection of library materials. However, weeding is not yet considered a common task, and many librarians feel reluctant to perform it or uncomfortable about opening a conversation with faculty about weeding. Drawing on the experience of the University of Ottawa Library, this workshop will cover the different stages of a deselection project:

  • establishing the need
  • defining the deselection criteria
  • integrating faculty feedback
  • creating the lists
  • communicating the project to the larger community
  • withdrawing the materials
  • disposing of the discarded materials
  • evaluating the project.

Participants will:

  • understand the complexity of a weeding project
  • identify a deselection need in their own library
  • develop their own weeding project.

Back to Top

 

Posters

Assessing the Evening/Weekend Library UX – Emily Christofides (University of Waterloo)

The University of Waterloo Library has gathered feedback on signage, space, furniture, and other aspects of the library environment. But much of this assessment work occurs during the day, and library staff wondered how well it would apply to the experience of users in the evenings and on the weekends. We wanted to learn what types of support and information needs our users had, how well our available services were meeting their needs, and whether there were differences in how patrons use the library in the evenings and on weekends. We gathered information through desk observation, a focus group with staff, and a student survey. We learned that in our libraries, the students using the library in the evenings and on weekends were similar to those during the day. From what we observed and heard, we concluded that areas for improvement relate to students’ experiences using the library space (e.g. desire for more study space, better wifi, use of elevators, eating, and cleanliness). Similarly, staff issues relate to supporting use of the space (e.g. reporting a leak for repair) rather than supporting patrons in library-specific issues (though these did also occur). Staff working at these times employed a problem-solving approach and found ways of helping users to the best of their abilities, only referring them on for further support in a minority of cases. Overall, we concluded that the evening and weekend user experience was not substantially different from the daytime experience, and that improvement efforts should focus on use and maintenance of the space.

Making Space for All: Evaluation in the UTSC Library Makerspace – Elizabeth O’Brien (University of Toronto)

Makerspaces in postsecondary education are spaces on campuses where students can find belonging, engagement, and active learning opportunities. Many public and academic libraries are investing in spaces to help support maker cultures. The University of Toronto Scarborough (UTSC) Library’s Makerspace opened in September 2016 in a renovated library space situated prominently near the library’s entrance. One goal of the space is to promote community building and student engagement and to create a space in which all students feel comfortable and welcome to participate. However, as much as we want the space to be used by all students, we are observing patterns of male dominance in the space, a pattern commonly found in makerspaces. Despite the democratizing potential of makerspaces, there have also been criticisms that they are recreating some of the historic biases found in male-dominated workspaces. To counter the imbalance, our Makerspace staff propose a series of workshops, events, and organizational development measures to encourage a greater number of women to embrace and use the library’s makerspace. This poster will display the work in progress of our team in developing programming opportunities to engage more women in the space. Emphasis will be on the various assessment methods we will employ to evaluate the impact of our programs and activities. Assessment activities within library makerspaces are still developing, so our approach will focus on a program evaluation framework.

New Assessment Design With a Pedagogical Change: Information Literacy Instruction For a Second-Year Chemistry Course – Shiyi Xie (Western University) & Jason Dyck (Western University)

Our poster and lightning presentation will present the assessment design for a pedagogical change project for information literacy instruction in a second-year chemistry course. The project aims to investigate a more effective instructional structure, including e-learning and active learning strategies, to engage students in learning. An online library module has been developed as an important component of a flipped classroom strategy and as a platform for assessments. Here is a brief description of the course plan and the assessments. Prior to the face-to-face library session, students will go through the library module and take a quiz that assesses their learning from the module. By analyzing the quiz results, the librarian will identify which topics to cover in the session. Students can also vote for topics using the poll tool. The librarian has a list of class activities pre-designed for all the main topics, but the session will only include those for the topics identified by the quiz and poll results. After the library session, students will be able to retake the quiz to assess their information literacy skills, and the quiz is worth a small percentage of their course grade. Two sets of quizzes (A, B) have been developed at an identical difficulty level. Students who work on Quiz A prior to the library session will take Quiz B after the session, and vice versa. A short survey will be available for students to give their feedback. The poster will include an overview of the project and the assessment pieces mentioned above.

Preservation Needs Assessment for Media-Bound Digital Content – Jess Whyte (University of Toronto)

Media-bound digital content – that is, digital holdings currently held only on media carriers such as floppy disks, optical media, flash drives, USB keys, ZIP disks, hard disk drives (HDDs), and other digital media that are not easily accessible – presents multiple and unique risk factors for preservation. Before the University of Toronto Libraries could begin to mitigate those risks, it first needed to assess:

  1. The extent of these materials within its collections,
  2. Their locations within the library system,
  3. Their local stewards, and
  4. The needs of those stewards.

This poster describes the preservation needs assessment methodology, its findings, and recommendations for future iterations or application by others.

Questioning Ask: Assessing a Collaborative Virtual Reference Service – Sabina Pagotto (Scholars Portal) & Kathryn Barrett (University of Toronto Scarborough)

Ask a Librarian is a collaborative chat reference service administered by the Ontario Council of University Libraries (OCUL) via its service arm, Scholars Portal. In 2018, a joint project between Scholars Portal and the University of Toronto Libraries, the largest participating institution, examined whether the service model, staffing choices, and policies of the chat reference service were associated with user dissatisfaction, aiming to identify areas where the collaboration is successful and areas that could be improved. The research team examined transcripts, metadata, and exit survey results from 473 chat interactions originating from 13 universities between June and December 2016. Transcripts were coded for mismatches between the chat operator’s and user’s institutions, and disclosures of such a mismatch; user, staff, and question type; how busy the shift was; and proximity to the end of a shift or service closure. Chi-square tests and a binary logistic regression were performed to determine the relationship of these variables with user dissatisfaction. The results largely reaffirm Ask a Librarian’s service model, staffing practices, and policies. Users are not dissatisfied with the service received from chat operators at partner institutions, or with service provided by non-librarians, and current policies for scheduling, service closure, and handling shift changes are appropriate. However, the analysis did uncover areas of concern to investigate further: certain user types were more likely to be dissatisfied, indicating that Ask could improve service to those user types; and users were more likely to be dissatisfied if they knew they were being served by an operator from a different institution, indicating that Scholars Portal should review policies around disclosure. Overall, this exercise demonstrates that institutions can trust the consortium with their local users’ needs, and underscores the need for periodic service review to ensure continued effectiveness.
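For readers less familiar with the statistics mentioned above, the sketch below shows the general shape of such an analysis: a chi-square test of independence between one coded variable and dissatisfaction, followed by a binary logistic regression with several predictors. The variable names and data are invented; this is not the study's dataset or code.

    # Hedged sketch of the statistical approach on invented, toy data.
    # Variable names are placeholders, not the study's actual coding scheme.
    import pandas as pd
    from scipy.stats import chi2_contingency
    import statsmodels.api as sm

    chats = pd.DataFrame({                      # 1 = yes / present, 0 = no / absent
        "dissatisfied":         [0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1],
        "institution_mismatch": [0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 0],
        "non_librarian_staff":  [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1],
        "busy_shift":           [0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1],
    })

    # Chi-square test of independence: institutional mismatch vs. dissatisfaction
    table = pd.crosstab(chats["institution_mismatch"], chats["dissatisfied"])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

    # Binary logistic regression with several coded predictors
    X = sm.add_constant(chats[["institution_mismatch", "non_librarian_staff", "busy_shift"]])
    model = sm.Logit(chats["dissatisfied"], X).fit(disp=False)
    print(model.summary())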

Taking Snapshots: The Role of Photographic Observations in Library Space Assessments – Giovanna Badia (McGill University)

Head counting and ethnographic studies of spaces can be labour-intensive when the areas under observation are large. Taking photographs can speed up the data collection process and assist in analyzing the data by providing additional details that were not originally captured by the researcher. Reports of space assessments in the literature describe multiple data collection methods and present results using various charts and tables. This poster has three objectives:

  1. To describe common best practices in the literature for using photography as a data collection method to answer questions about spaces;
  2. To provide practical strategies for extracting and analyzing relevant information from the photographs taken; and
  3. To summarize the types of data visualizations used in published studies that presented results captured by photography.

The findings of this poster provide direction for when and how to use photographic observations to evaluate library spaces and identify how to effectively communicate the results of this activity, both of which will be useful to assessment practitioners.

Using GreenGlass to Support a Book Weeding Project – Ingrid Moisil (University of Ottawa) & Sarah Simpkin (University of Ottawa)

The University of Ottawa Library embarked on a large-scale book weeding project in early spring 2016. We used GreenGlass, an online tool developed by Sustainable Collection Services and later acquired by OCLC, which allowed librarians to analyse the book collection and prepare the deselection lists. This poster discusses how we set up GreenGlass, the results, and lessons learned.

Back to Top