Analytics can be a powerful tool for identifying students at risk of failing or dropping out. But sometimes analytics gets it wrong, and when used inappropriately, it can do real harm to both the student and the institution. What can be done to reduce the risk that analytics causes more problems than it solves?
Here are 10 suggestions:
- Acknowledge the reality that the data colleges and universities currently use to identify at-risk students only tells a small part of one side of the story. Usually, it does not provide a complete picture of what happens in the “middle” of an interaction between student and institution, such as the way in which financial services or a registrar’s office deals with an issue or challenge. The college or university may actually be the key missing variable, not the student.
- Look for data that provides a more holistic and equitable understanding of risk, whether that data refers to students or to the risk of your college or university failing to deliver on its moral, contractual, pedagogical and fiduciary duties.
- Use a range of student, institutional and contextual data to understand risk and develop risk-tolerant learning experiences. Broaden the dataset based on feedback from students, faculty, learning strategists and other stakeholders wishing to utilize the data. See the data as evolving, not fixed and final.
- Understand risk as neither permanent nor based solely on an individual’s disposition, context, self-efficacy, learning style or disabilities, or on the degree to which they believe they control the outcomes of events in their own lives. Instead, understand risk as a mostly temporary condition of misalignment between the student, the institutional and/or disciplinary context, and macro-societal factors such as hunger, anxiety, health, financial hardship, sexism, racism, family conditions or social challenges.
- Acknowledge that risk is messy, multidimensional and fluid. Student learning and risk are never truly binary; they are complicated and complex, and need elaboration beyond the risk indicators embedded in any analytics. A student’s history, social situation, family circumstances, job and preferred ways of learning are variables generally not measured or accounted for.
- Move away from a deficit understanding of students at risk toward understanding risk as emerging from the interaction among the student, their learning and the organization. Understand that at-risk is a condition that applies not only to students, but also to courses, departments, assessments, processes and faculty/staff at an institution. A student may be at risk because the assessments of learning are poorly designed or delivered.
- Understand the limitations of the data institutions collect and the need to continually revise datasets to include more variables and more experiences. Make these datasets work to recognize and reflect the complexity of the student experience and of institutional learning experiences.
- Create responses and actions for at-risk students that are not rigid or set in stone but serve as an invitation to conversation and dialogue aimed at supporting, helping and enabling success. Flagging a student as at risk does not give an institution permission to act in ways that are detrimental to the student or their learning journey. Use the risk indicators as a jumping-off point for discussions about how to mitigate risk and maximize achievement.
- Be transparent. Students and staff need to know what data are collected, who collects them, what they are being used for and under what conditions. Students and staff must also have access to processes and resources to question and/or appeal decisions based on the data. They are the owners of their data and, as such, have a vested interest in being part of any interventions based on its collection, analysis and use.
- Recognize that data about students reflect not just the students, but also institutional decisions about course design, assessment and pedagogy. A course that consistently “produces” large numbers of at-risk students is more likely to be the problem than the students taking it. See analytics as being as much about course design and delivery as about the learners.
As data analytics is deployed more broadly, institutions, faculty and instructors can learn from each other about what is working, what is problematic and what challenges — operational, ethical and legal — its use can give rise to.
Institutions must be held legally and publicly responsible for proven bias and/or improper use of analytics. Misuse not only harms the student but can also damage the faculty, staff and institution. Done correctly, however, the more we know and the earlier we know it, the more effectively interventions can make a positive difference.