Guest blog by Dr. Peter Strelan
I coordinate and teach a large first-year Research Methods course in Psychology. As much as possible, I try to use student data and experience as a feedback loop to illustrate key principles in the course. Having collaborated with Daniel Barry from the Analytics team in Learning Enhancement and Innovation, I'm now able to share a valuable set of results from previous cohorts' performance in this course, which I use to illustrate to students the concepts of longitudinal design, quasi-experimental design, and confounding variables.
At the same time, these data allow me to impress upon students a practical point that is very close to their hearts. The data from previous years (Figure i) show a clear linear trend: the more students engage with MyUni, the more likely they are to achieve a higher mark in the course overall.
Figure i. Research methods in Psychology (2016-7) Activity and Final Mark Scatter Chart.
It is worth noting that some successful students in my course are relatively inactive, while other, more active students fail. Less active students, however, have a much broader range of outcomes than more active ones. Of the 732 students who remained enrolled in my course throughout the semester, 167 were active on more than 32% of days. Every one of those students passed, and 35 achieved an HD. Of the remaining 565, 134 failed, though 29 achieved an HD.
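The kind of breakdown above is simple to reproduce: split the cohort at an activity threshold and tally outcomes on each side. A minimal sketch follows, using made-up records rather than the actual course data; the threshold and pass mark are assumptions for illustration.

```python
# Sketch: splitting a cohort by an activity threshold and tallying outcomes.
# The records below are illustrative placeholders, not the actual course data.
records = [
    # (percent of days active, final mark out of 100)
    (45, 88), (38, 74), (33, 62), (12, 30), (20, 55),
    (8, 42), (28, 68), (50, 91), (15, 49), (5, 25),
]

THRESHOLD = 32   # percent of days active, as in the cohort analysis
PASS_MARK = 50   # assumed pass boundary for this sketch

active = [mark for pct, mark in records if pct > THRESHOLD]
less_active = [mark for pct, mark in records if pct <= THRESHOLD]

active_pass = sum(mark >= PASS_MARK for mark in active)
less_active_fail = sum(mark < PASS_MARK for mark in less_active)

print(f"Active students: {len(active)}, of whom {active_pass} passed")
print(f"Less active students: {len(less_active)}, of whom {less_active_fail} failed")
```

Run on real export data, the same two-line split would reproduce the 167/565 figures reported for the cohort.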
Looking past the outliers at average activity rates grouped by final grade, it is possible to plot the trajectories typical of students who achieve specific outcomes.
Figure ii. Average Activity Percentages by Final Grade.
Figure ii shows that HD students in my course logged in five days a fortnight, compared with four days for both D and C students, three days for P students, and just two for those who failed. Students who failed without submitting any work logged in just once every three weeks.
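Converting an average percent-of-days-active rate into the "days per fortnight" unit used above is a one-line calculation. The rates below are hypothetical placeholders, not the course's actual averages:

```python
# Sketch: turning an average daily-activity rate into "days active per fortnight".
# The percentages by grade are illustrative, not the actual course averages.
FORTNIGHT = 14  # days

avg_activity_pct = {
    "HD": 36, "D": 29, "C": 27, "P": 21, "F": 14,
}

days_per_fortnight = {
    grade: round(pct / 100 * FORTNIGHT)
    for grade, pct in avg_activity_pct.items()
}
print(days_per_fortnight)
```

With these placeholder rates the conversion yields the same ordering as Figure ii: five days for HD, four for D and C, three for P, two for F.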
Figure iii. Weekly Average Activity Rates by Final Grade.
A look at student engagement through the semester (Figure iii) shows that higher-grading students were consistently more active than their peers from the start. The differences in activity evident at the end of semester (Figure ii) were already present in the first week: students who went on to achieve an HD separated from D and C students, who maintained very similar rates, while the average activity rate of students who achieved a P was already behind those groups but ahead of those who failed.
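The week-by-week view in Figure iii comes from the same data, aggregated by grade band and week. A minimal sketch of that aggregation, again on hypothetical rows rather than the actual logs:

```python
from collections import defaultdict

# Sketch: weekly average activity rate per grade band, the aggregation
# behind Figure iii. Each row is a hypothetical (grade, week, was_active)
# observation for one student-day; real data would come from the LMS logs.
rows = [
    ("HD", 1, 1), ("HD", 1, 1), ("P", 1, 0), ("P", 1, 1),
    ("HD", 2, 1), ("HD", 2, 0), ("P", 2, 0), ("P", 2, 0),
]

totals = defaultdict(lambda: [0, 0])  # (grade, week) -> [active days, observations]
for grade, week, was_active in rows:
    totals[(grade, week)][0] += was_active
    totals[(grade, week)][1] += 1

weekly_rate = {key: active / n for key, (active, n) in totals.items()}
print(weekly_rate)
```

Plotting `weekly_rate` as one line per grade band gives a chart of the same shape as Figure iii.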
Being able to convey this information in a clear, visual way encourages student engagement, and while it doesn't chart a clear pathway to success, it shows the sort of effort required to get there. Ensuring that students have sufficient content to engage with online makes it possible to identify the less active ones early enough for them to alter their trajectory. When a student's level of engagement isn't typical of those who passed in previous years, that information can form the basis of targeted messaging. This ability to identify students at risk, including ones you might not otherwise have noticed before it was too late, demonstrates the value of this type of analytical data.