Institutional policy frameworks should not only provide an enabling environment for the optimal and ethical harvesting and use of data, but also clarify who benefits and under what conditions, establish conditions for consent and the de-identification of data, and address issues of vulnerability and harm. A directed content analysis of the policy frameworks of two large distance education institutions shows that current policy frameworks do not provide the enabling environment learning analytics needs to fulfil its promise.
This paper focuses on work conducted at The Open University (OU), one of the world’s largest distance learning institutions, into predicting which students are at risk of failing a module. Since tutors at the university do not interact face to face with students, it can be difficult for them to identify and respond to struggling students in time to resolve the difficulty. Predictive models were developed and tested using historic Virtual Learning Environment (VLE) activity data, combined with other data sources, for three OU modules. These revealed that it is possible to predict student failure by looking for changes in user activity in the VLE. More focused analysis of these modules yielded some promising results for forming accurate hypotheses about which students will fail.
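The core finding, that changes in VLE activity are more predictive than absolute levels, can be illustrated with a minimal sketch. This is not the OU's actual model; the function names, the use of weekly click counts, and the 50% drop threshold are all illustrative assumptions.

```python
# Hypothetical sketch: flag at-risk students from *changes* in VLE activity
# rather than absolute activity levels. Threshold and data shape are assumed.

def weekly_change(clicks):
    """Relative change in activity between consecutive weeks."""
    changes = []
    for prev, curr in zip(clicks, clicks[1:]):
        if prev == 0:
            # From zero activity, treat any activity as a full rise.
            changes.append(0.0 if curr == 0 else 1.0)
        else:
            changes.append((curr - prev) / prev)
    return changes

def at_risk(clicks, drop_threshold=-0.5):
    """Flag a student if activity falls by more than the threshold
    (here, a >50% week-on-week drop in VLE clicks) at any point."""
    return any(c <= drop_threshold for c in weekly_change(clicks))

# A consistently low but stable student is not flagged,
# while a student whose activity collapses is.
print(at_risk([5, 5, 4, 5]))     # stable low activity -> False
print(at_risk([40, 38, 10, 2]))  # sharp decline -> True
```

The point of the sketch is the design choice itself: comparing each student against their own recent behaviour sidesteps the problem that some students succeed on little VLE use while others need a lot.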
Comment by Martyn Cooper: This paper reports an investigation, in the context of online tutoring, of the relation between learning analytics metrics and retention. It highlights the important result that little can be deduced from absolute levels of interaction, but that changes in the level of interaction are significant. Although this is a limited study covering one VLE and a few modules, it is likely to be a paper with wider significance.