Learning Analytics Faculty Partnership Project – Student Learning and the Use of Lecture Recordings

Diving in

During the pandemic, the digital delivery of lectures made recordings very easy to create and share with students. As a result, many students have become accustomed to incorporating them into their learning and continue to do so today.

However, while some instructors share this resource with students, others may deliberately not provide lecture recordings — two opposite approaches with the same goal of improving students’ learning. Indeed, educators may be unsure of the potential benefits and downsides of lecture recordings, as research into their efficacy has shown mixed results [e.g., 1, 2, 3, 4, 5, 6]. Educators might wonder whether students will forgo attending lectures in person to watch the recordings online, whether the replayability of recordings allows students to take better lecture notes, and more.

The Questions

How do students engage with lecture recordings?

  • When do they watch or listen to the recordings?
  • How many times do they watch or listen to the recordings?
  • Which lectures get watched or listened to more than others?
  • How does watching or listening to lecture recordings correlate with other student behaviours (e.g. does it affect their attendance or note-taking behaviours)?

How does lecture recording use vary by student?

  • Does students’ prior disciplinary knowledge affect their use of lecture recordings?
  • Do students’ metacognitive skills affect their use of lecture recordings?

What impact do lecture recordings have on student learning?

  • Is there a correlation between lecture recording use and performance on course assessments, such as assignments and exams?

The Approach

In 2025 WT1, Dr. Graves used Panopto to record his lectures in two parallel offerings of ECON_V 326, a large economics course on Methods of Empirical Research. He used a staggered release schedule to grant students access to the recordings via Kaltura. For the first third of the term, lectures were recorded, but no students had access to the recordings. In the middle third of the course, one section of students was selected at random and given access to the lecture recordings (both all the recordings to date and new recordings after each lecture). In the final third of the term, both sections had access to the lecture recordings (again, both historical and new recordings).

Data from the LA Team

The Learning Analytics team will facilitate access to data relating to student use of:

  1. Lecture recordings from Kaltura, the media platform used to store and share videos in UBC courses; and
  2. Canvas, the Learning Management System used across UBC courses.

While Panopto is being used for lecture capture, Kaltura is being used for playback. This is because Kaltura provides additional per-viewer analytics not available in Panopto. Kaltura provides viewing logs down to a video-percentile level of granularity: each video is segmented into 100 slices, and daily per-viewer activity is tracked for each slice. This level of detail enables analysis of which users are watching videos, which videos they are watching, when in the term they are watching them, and (within a video) which specific segments they are watching.
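As a rough illustration of what this slice-level data supports, the sketch below computes a per-viewer completion fraction from hypothetical viewing records. The field names and record layout here are invented for illustration and do not reflect Kaltura's actual export format:

```python
from collections import defaultdict

# Hypothetical slice-level viewing records: one row per
# (viewer, video, date, slice), where slice runs 0-99 and
# plays counts how many times that slice was played that day.
records = [
    {"viewer": "u1", "video": "lec01", "date": "2025-09-10", "slice": s, "plays": 1}
    for s in range(75)  # u1 watched the first 75% of lec01
] + [
    {"viewer": "u2", "video": "lec01", "date": "2025-09-12", "slice": s, "plays": 2}
    for s in range(40, 60)  # u2 replayed the middle fifth twice
]

# Fraction of each video's 100 slices that a viewer touched at least once.
coverage = defaultdict(set)
for r in records:
    coverage[(r["viewer"], r["video"])].add(r["slice"])

completion = {key: len(slices) / 100 for key, slices in coverage.items()}
print(completion)  # {('u1', 'lec01'): 0.75, ('u2', 'lec01'): 0.2}
```

The same slice sets could instead be aggregated across viewers to find which segments of a lecture are rewatched most, or grouped by date to see when in the term viewing happens.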

From Canvas, the LA team will facilitate access to data both explicitly and implicitly collected as part of the course offering. Examples of explicit data collection include grades on course assessments, survey responses (including a metacognitive inventory offered as a Canvas Quiz), and participation in course activities such as discussions. Implicit data, by contrast, includes logs of which students are accessing which course resources, and when.

Preserving student privacy will be top of mind throughout this process. The Learning Analytics team will use student identifiers (such as student number) to join data across Kaltura and Canvas and to exclude any students who declined to participate in the research study. All direct identifiers (name, student number, etc.) will then be removed from the dataset before it is provided to Dr. Graves, and final course grades will be submitted before Dr. Graves accesses the data. This study was reviewed and approved by the UBC Office of Research Ethics (H25-02012).
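The join-then-de-identify step described above can be sketched as follows. Every name, number, and field here is invented for illustration; the real Kaltura and Canvas extracts will have different schemas:

```python
# Hypothetical per-student extracts, keyed by student number.
kaltura = {
    111: {"minutes_viewed": 42.0},
    222: {"minutes_viewed": 7.5},
    333: {"minutes_viewed": 0.0},
}
canvas = {
    111: {"name": "A", "quiz_score": 8},
    222: {"name": "B", "quiz_score": 6},
    333: {"name": "C", "quiz_score": 9},
}
opted_out = {222}  # students who declined to participate

deidentified = []
research_id = 1
for student_number in sorted(kaltura.keys() & canvas.keys()):
    if student_number in opted_out:
        continue  # honour opt-outs before any data is shared
    row = {"research_id": research_id, **kaltura[student_number]}
    row["quiz_score"] = canvas[student_number]["quiz_score"]
    # direct identifiers (student number, name) never enter the output row
    deidentified.append(row)
    research_id += 1
```

The key design point is that the student number is used only as a join key and opt-out filter; the shared dataset carries an arbitrary research ID instead.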

Video Analytics

[Screenshot: a Kaltura analytics overview page for a single video, showing player impressions, plays, unique viewers, minutes viewed, and average completion rate for a selected date range.]
Kaltura provides analytics/details on the life of a lecture recording. This screenshot offers some big-picture data on a given video, such as: how many times has the video been played? How many different students have watched the video? How much of the video is typically watched? Source: Kaltura Knowledge Centre
[Screenshot: a Kaltura engagement page for a single video, showing a graph of views and unique authenticated views across the video's timeline, alongside a per-user table of player impressions, plays, and average and total completion rates.]
Kaltura also provides more detailed analytics. Using the graph provided here, we can zero in on the number of plays at each moment in the video. We can also see by-user data in the table, telling us how many times each student has watched the video. For the data used in this project, we will also be accessing by-date granularity. Source: Kaltura Knowledge Centre

Data Analysis

The staggered video access will allow for multiple differential analyses. For example, it will be possible to compare aggregate LMS activity across sections in the first and middle thirds of the course: in the first third, neither section had access to lecture recordings, whereas in the middle third, one section did. If we hypothesized that viewing lecture recordings acted as a replacement for other means of engaging with course materials, and that their availability had a corresponding negative effect on LMS usage, then we would expect to see a decline in LMS activity in one section but not the other. Comparing the middle and final thirds of the course, we have a second opportunity to evaluate the same hypothesis: the section that gains access to the lecture recordings only in the final third of the course should display a decrease in LMS activity, whereas LMS activity for the other section should be unchanged. Of course, we also expect to see temporal trends in patterns of engagement; but the parallel course section design used in this project enables us to tease out which effects can be attributed to the treatment (access to lecture videos).
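The section-by-period comparison described above amounts to a difference-in-differences estimate: the change in the treated section minus the change in the control section over the same period. A minimal sketch, with entirely made-up activity numbers:

```python
# Hypothetical mean weekly LMS page views per student, by section and
# period of the term. In the middle third, only section A has recordings.
activity = {
    ("A", "first"): 50.0,   # section A, no recordings yet
    ("A", "middle"): 41.0,  # section A, recordings released
    ("B", "first"): 48.0,   # section B, no recordings
    ("B", "middle"): 46.0,  # section B, still no recordings
}

# Change in the treated section minus change in the control section:
# shared temporal trends cancel, leaving the treatment effect.
change_a = activity[("A", "middle")] - activity[("A", "first")]
change_b = activity[("B", "middle")] - activity[("B", "first")]
did_estimate = change_a - change_b
print(did_estimate)  # -7.0 under these made-up numbers
```

Subtracting section B's change removes any term-wide trend (e.g., activity dipping for both sections after the midterm), which is exactly why the parallel-section design is more informative than a single before/after comparison.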

Experimental Design

This study uses staggered access to lecture recordings, based on timing and section. But why choose this approach over a simpler alternative? The experimental design was chosen carefully, both to avoid potential confounds and to prioritize fairness in students’ experiences. The sections below outline this reasoning.

Option: Access based on section

Description: Section A has access to lecture recordings (treatment), while section B does not have access to lecture recordings (control).

Discussion

This design may lead to section-based effects which confound the results. For example, high-performing students may be over-represented in Section A. Or perhaps Dr. Graves’s lecture delivery may be substantially better at one time of day compared to the other, leading to more engaged students in that section. Both cases could give the false impression that the treatment (access to lecture recordings) boosts grades.

Option: Access based on random assignment

Description: 50% of students (across all sections) have access to lecture recordings (treatment), the other 50% do not have access to lecture recordings (control).

Discussion

Within a course section, there may be concerns from students about fairness and equity in teaching. Though this could also be a concern with section-based access, it is standard teaching practice for different course sections to have slightly different experiences (e.g., different instructors, schedules, classrooms, etc.). It is less common to have different experiences within a given section of a course. Thus, control-group students may feel disadvantaged when their peers have access to lecture recordings while they do not.

Option: Access based on timing

Description: Initially, no students have access to the lecture recordings, then partway through the course (e.g., after the first midterm), all students have access to lecture recordings.

Discussion

There may be time-based effects that confound the results. For example, students might view the midterm exam as a wake-up call and significantly improve their study behaviour. This could give the false impression that access to lecture recordings boosts students’ performance.

Chosen Option: Staggered access

Description: For the first third of the term, no students have access to the lecture recordings. Then, in the middle third of the term, section A has access to all lecture recordings. In the final third of the term, both sections have access to the lecture recordings.

Discussion

The staggered release design tries to address all the aforementioned challenges with the other potential experimental designs. Unlike section-based access, all students will experience both no access and access (albeit on different schedules), lessening the impact of section-level confounding effects. Unlike randomly assigned access, all students in the same section have the same experience; in fact, the experience only differs by section in the middle third of the course – for the first third no students have access to lecture recordings, while in the last third all students have access. Unlike purely timing-based access, only one section receives access to lecture recordings at a time. This combats potential timing-based confounds (e.g., the midterm wake-up-call effect), which would appear in both sections at the same time.

Project Timeline

Milestone: June 2025 – LTIC’s Learning Analytics team accepts Dr. Graves’s project proposal.

Dr. Graves applies and is selected in the very first round of applications for the new Learning Analytics Faculty Partnership Program. The program is intended to connect faculty who have pedagogical research questions with the data they need to find answers. Successful applicants receive support from the LA team that is tailored to their needs, whether it’s facilitating access to learning data, developing visualizations, or assisting with data analysis.

Summer 2025 – BREB approves study.

Dr. Graves submits a proposal for the project to the Behavioural Research Ethics Board (BREB) at UBC, and receives their approval to begin the study. As part of the informed consent process, students will be informed of the research and given the option to ask questions and/or opt out of participating.

September – December 2025 – Dr. Graves teaches with A/B testing, by varying lecture recording access across course sections.

Dr. Graves teaches two sections of a large undergraduate economics course at UBC, ECON 326: Econometrics II. He records his weekly in-person lectures. After the first midterm, the students in one section gain access to all lecture recordings via the video-sharing platform Kaltura. After the second midterm, the students in the other section gain access.

Milestone: January 2026 – LTIC gathers the data and prepares it for analysis.

The Learning Analytics team at LTIC writes Python scripts to extract data from the Kaltura API – essentially grabbing all the data that Kaltura Analytics provides to instructors on students’ use of the lecture recordings. This high-granularity data will permit a variety of questions to be investigated.
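Analytics exports of this size are typically fetched page by page. The sketch below shows the general pagination pattern such a script might follow; fetch_page is a stub standing in for a real API call (the actual Kaltura client and endpoint names are not shown here):

```python
# Stub standing in for a real API call; pretend the service
# holds 1,234 viewing-log rows and serves them in pages.
def fetch_page(page_index, page_size=500):
    total = 1234
    start = page_index * page_size
    return [{"row_id": i} for i in range(start, min(start + page_size, total))]

def fetch_all(page_size=500):
    """Request successive pages until a short page signals the end."""
    all_rows, page_index = [], 0
    while True:
        rows = fetch_page(page_index, page_size)
        all_rows.extend(rows)
        if len(rows) < page_size:  # short page => no more data
            break
        page_index += 1
    return all_rows

rows = fetch_all()
print(len(rows))  # 1234
```

The stop condition (a page shorter than the requested size) keeps the script from depending on a separate total-count field, though a real extraction script would also need authentication and error handling.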

February – May 2026 – Dr. Graves analyzes student engagement and outcomes.

Milestone: Summer 2026 – Dissemination of results… and/or planning for further data collection!

Why it matters

Meet the Team

  • Jonathan Graves, Associate Professor of Teaching & Director of Undergraduate Studies (Curriculum and Students), Vancouver School of Economics

Meet the Learning Analytics Team at the Learning Technology Innovation Centre

  • Craig Thompson, Programmer Analyst II, Learning Analytics at LTIC
  • Anna Stacey, Programmer Analyst, Learning Analytics at LTIC

Citations

  • [1] Williams, A., Birch, E., & Hancock, P. (2012). The impact of online lecture recordings on student performance. Australasian Journal of Educational Technology, 28(2).
  • [2] Nordmann, E., Calder, C., Bishop, P., Irwin, A., & Comber, D. (2019). Turn up, tune in, don’t drop out: The relationship between lecture attendance, use of lecture recordings, and achievement at different levels of study. Higher Education, 77, 1065-1084.
  • [3] Morris, N. P., Swinnerton, B., & Coop, T. (2019). Lecture recordings to support learning: A contested space between students and teachers. Computers & Education, 140, 103604.
  • [4] MacKay, J. R. (2019). Show and ‘tool’: How lecture recording transforms staff and student perspectives on lectures in higher education. Computers & Education, 140, 103593.
  • [5] Artz, B., Johnson, M., Robson, D., & Siemers, S. (2022). Live or lecture capture: Evidence from a classroom random control trial. International Review of Economics Education, 40, 100240.
  • [6] Leadbeater, W., Shuttleworth, T., Couperthwaite, J., & Nightingale, K. P. (2013). Evaluating the use and impact of lecture recording in undergraduates: Evidence for distinct approaches by different groups of students. Computers & Education, 61, 185-192.