Opportunity
Over the last few years, Dr. Katie Piatt, eLearning Services Manager, and many of her colleagues at the University of Brighton in England have been learning more about the emerging field of learning analytics, with its promise of better understanding, communication and interaction with students.
It is often assumed that universities require substantial funding and support from large, private companies to effectively take advantage of the potential offered by learning analytics. The University of Brighton decided on a different path, choosing to assess their existing learning analytics data for usefulness and possible applications, before undertaking any larger scale projects. The Virtual Learning Environment (VLE, also called a Learning Management System) has data on student access, time logged in, test grades and even library records. The focus is on what the university could learn and apply from that data.
The second significant choice was that the learning analytics would be what Dr. Piatt describes as “student facing” – offered to students to inform them about their behaviour and results, with the goal of encouraging changes, where necessary, for better success and retention. The starting point is to offer students actionable insights.
The learning analytics project has two streams: the first collects and delivers learning analytics results to students on their VLE dashboards, while the second explores the student activity data available through the VLE for insights and applications to be shared with students, as well as faculty, staff and administration.
Innovation
Student Dashboard: In academic year 2016-2017, eLearning Services began working with the Business School on a two-year pilot for delivering learning analytics to students about their current performance (students at University of Brighton study in face-to-face classes). The initial step was a presentation of emojis on students’ login page for the VLE, so students could choose one to represent how they were feeling that day.
This input is optional with a very low barrier to participation – the students click on an emoji. They are also offered the opportunity to add comments. The student dashboard offers an overview of this input – showing the range of student responses in a “heat map”.
The dashboard also shows students their grades and their level of access to the VLE, called “studentcentral”, in comparison with the average of the rest of the class. Average attendance is also presented. The University of Brighton does not have an attendance policy, but class presence is recorded. Because students expect the attendance data presented to them to be correct, the project prompted a shift to more accurate, electronic attendance records.
Student Activity Data: The second stream of the initiative looks at the analytics of student activity currently available from the VLE. A user activity count is calculated based on the number of daily visits to the VLE by each student (time on the VLE is not helpful as sessions time out after no activity – so offers no accurate measure of how long a student was engaged). These data are used to model and predict module attendance over time and final grade. The predictions are matched to the actual results.
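The daily-visit metric described above can be sketched in code. The log format, field names and sample data below are illustrative assumptions, not the university's actual VLE schema; the point is that multiple accesses on the same day count once, so unreliable session length never enters the measure.

```python
# Sketch of a per-student daily-visit activity count, as described above.
# The (student_id, access_date) log format and the sample data are
# hypothetical, not the actual VLE schema.
from collections import defaultdict
from datetime import date

def activity_counts(log):
    """Count the distinct days on which each student accessed the VLE.

    Multiple accesses on the same day count once, so session length
    (which the VLE cannot measure reliably, since sessions time out
    after inactivity) plays no part in the metric.
    """
    days_seen = defaultdict(set)
    for student_id, access_date in log:
        days_seen[student_id].add(access_date)
    return {sid: len(days) for sid, days in days_seen.items()}

log = [
    ("s1", date(2017, 10, 2)), ("s1", date(2017, 10, 2)),  # same day: counts once
    ("s1", date(2017, 10, 3)),
    ("s2", date(2017, 10, 2)),
]
print(activity_counts(log))  # {'s1': 2, 's2': 1}
```

A count like this, tallied week by week, is the kind of series that could then be compared against module attendance and final grades.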
Outcomes and Benefits
Student Dashboard: At the start of the project, staff in the eLearning Centre and the Faculty of Business briefed students on the collection, presentation and use of the dashboard data, stressing it is for their guidance and information. Student responses from focus groups during the pilot study indicate positive attitudes to the information received and, in many cases, behavioural changes. For example, students with above-average attendance report that they do not decrease their attendance, but that this information takes some pressure off. Those with below-average attendance want to catch up, often not having realized how often they were absent. Students also log in frequently on results day, checking the chart showing how they measure against the class average. Students appreciate that the dashboard gives them access to previously unavailable data.
The heat map created from the emojis, presenting the “mood” of the course, makes students feel less isolated, realizing they are not the only ones having a bad (or good) day, and also indicates to them that the university cares about them beyond numbers and final grades. Students have favourite emojis, including the ‘poo’ one, and suggested additions, such as the heart. The “how are you feeling” input is not used to make predictions – only for reporting back to students.
The dashboard is visible to tutors as well as students, offering stimulus for discussion.
Student Activity Data: Students do not find the information on how frequently they log into the VLE of particular interest, but this is the indicator of most importance to the university. Recent analysis shows the level of student activity on the VLE to be an accurate predictor of final results. During the research, the researchers’ predictions of student final results based on activity levels closely correlated with actual results. A significant finding is that the pattern of low activity that leads to poor results emerges as early as week five.
Other findings include that students who do not submit the first assignment almost always fail or drop out. Library loan data, on the other hand, offer no significant indicators.
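The two findings above suggest the shape of an early-warning rule: flag a student by the end of week five if their VLE activity is low or their first assignment is missing. The thresholds and inputs below are illustrative assumptions, not the university's actual model.

```python
# Sketch of an early-warning flag based on the findings above:
# low VLE activity by week five, or a missing first assignment.
# The thresholds here are illustrative assumptions only.
def at_risk(visits_by_week, submitted_first_assignment,
            weeks=5, min_visits_per_week=2):
    """Return True if a student should be flagged by the end of `weeks`."""
    if not submitted_first_assignment:
        return True  # non-submission almost always precedes failure or drop-out
    early_visits = sum(visits_by_week[:weeks])
    return early_visits < weeks * min_visits_per_week

print(at_risk([3, 2, 4, 3, 2], True))   # False: active and submitted
print(at_risk([1, 0, 1, 0, 0], True))   # True: low week-five activity
print(at_risk([3, 2, 4, 3, 2], False))  # True: first assignment missing
```

A rule of this kind only identifies who to contact; as the next section notes, the harder question is what intervention actually helps.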
Challenges and Enhancements
Any use of student data raises questions of ethics and privacy. The University of Brighton uses the data to better support students, making it all transparently available to them. Classroom briefings, focus groups, cooperation with the Students Union, and making each student’s data available on his or her dashboard help overcome the perception of “Big Brother” collecting and using hidden data. Faculty and staff see the same individual performance data as the students themselves.
The research study and development of the student dashboard used existing university resources in terms of funding and data. The university is moving forward slowly and deliberately, using and enhancing internal skill sets to assess possibilities and measure results before making a larger-scale commitment.
The analysis showing VLE activity to be a strong predictor of final results raised the question of appropriate interventions to support students as early as week five to prevent failure and increase retention. Current remedial efforts, such as sending automatic e-mails after students miss three lectures in a row, have not been found effective. Better interventions are being considered, especially interventions that are not automated.
Students can submit comments with their emojis, which raises a ‘duty of care’ responsibility. If a student consistently reports sadness, inadequacy or other negative emotions, a suitable and personal response is essential.
Potential
The finding that student activity level as early as week five predicts final results can be the basis for redesign of student support and interventions strategies, as well as significant course redesign. eLearning Services is working with partners throughout the university on best responses.
After the two-year pilot in the Business School, the plan is to extend the student dashboard initiative throughout the university, starting in September 2018. eLearning Services is part of a university-wide Steering Group supporting this expansion. Schools and faculties across the university have different disciplines and patterns of data use, and so their acceptance of the data and dashboard as designed for the Business School needs to be considered. Students in some programs may not want access to comparative data on results or attendance. This must be balanced with a need for consistency and engagement.
Teaching staff throughout the university must be introduced to the dashboard – how it works, the information supplied and its implications – so that they can explain it to students. As well, attendance records need to be kept consistently to meet student expectations of accurate information.
University of Brighton is linked with the JISC project that is looking to develop learning analytics tools for use in institutions across the United Kingdom. JISC is the UK’s higher and further education and skills sectors’ membership-based, not-for-profit organization for digital services and solutions, with research and development as key responsibilities.
The University of Brighton recognizes the potential of learning analytics but wants to choose directions and applications appropriate for its institution. The current developments are in-house; long-term, the university would like to be part of the significant national initiatives spearheaded by JISC.
For Further Information
Dr. Katie Piatt
eLearning Services Manager
University of Brighton
Brighton, England
[email protected]
Slide set: Can you predict a degree from 5 weeks worth of VLE data?