By Cameron Miranda-Radbord and Maia De Caro

Early in our class, we read articles and held discussions about audit, evaluation, and efficacy, which shaped our decision to examine assessment processes within the division of Student Life. In this context, assessment is used to identify and understand how effectively programs meet their goals, which center on student success and development. Our exploration led us to the Signature Program Assessments (SPAs), an initiative launched alongside the 2021-2026 Strategic Plan. SPAs were created as a unit-level assessment tool tied to the Plan's objectives, offering insight into how programs are achieving their goals and into the rationales behind decisions made by program leads. They operate within a structured framework that organizes the assessment process.
In the early stages of our fieldwork, we discovered that SPAs had been put on hold due to staff turnover across the division and the prioritization of developing the 2026 Strategic Plan. As a result, we shifted our focus to units that had previously completed their SPA: Starting Point (Maia) and Accessibility Services (Cameron). In doing so, we observed notable differences in how each unit approached both SPAs and assessment overall. While Accessibility Services found SPAs highly valuable and integrated them directly into their assessment practices, Starting Point relied on its own assessment methods, treating SPAs as a secondary consideration.
Starting Point engaged with SPAs minimally, using them more as a prompt to draw on its existing internal data for improvements than as a framework to guide its assessment practices. The unit instead prioritized its own methods of data collection and analysis. Previously, the program operated on a year-long model, which had consistently produced high enrollment but low completion rates over the past few years. Last year, the Lead Coordinator conducted an internal assessment in which he recognized that gap and considered possible improvements to address it, ultimately leading to the restructuring of the program into a tiered system. While Starting Point's SPA was completed, it consisted primarily of data copied and pasted from that internal assessment.
By contrast, Accessibility Services saw SPAs as an internal opportunity for reflection. In their two SPAs, focused on Front Desk Appointments and Emails to Advisors respectively, staff posed forward-looking questions to assess their programs and guide the process; for example, “Are there more efficient & effective ways to manage the flow of incoming email/phone requests from students to connect with resources (both within and beyond our office)?” Stemming from this question, the SPA went on to describe the advantages and procurement of a new E-Ticketing system to replace direct emails. Comparatively, the same section of the Starting Point SPA contained what seemed to be a boilerplate explanation of the general purpose and mandate of the service, with the only potential change discussed being: “[Starting Point]… doesn’t currently draw on Indigenous approaches of learning and ways of knowing but maybe it should.” No plan or proposal for action on “Indigenous approaches of learning” followed. From this, we gathered that each unit valued and prioritized different components of their SPAs, likely due to their distinct purposes.
The Assessment Cycle forms the foundation of SPAs and is designed to guide the assessment process by establishing clear benchmarks aimed at identifying gaps and driving improvement, key principles of assessment. However, our analysis revealed that while both programs adhered to these principles, only one followed the Cycle that underpins the SPA template; Starting Point employed its own methods, deviating from this framework.