Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper presents an analytics dashboard that has been developed for designers of interactive e-books. This work is part of the EU-funded MC Squared project, which is developing a platform for authoring interactive educational e-books. The primary objective is to develop technologies and resources that enhance creative thinking for both designers (authors) and learners. The learning material is expected to offer learners opportunities to engage creatively with mathematical problems and develop creative mathematical thinking. The analytics dashboard is designed to increase authors' awareness so that they can make informed decisions on how to redesign and improve the e-books. This paper presents architectural and design decisions on key features of the dashboard, and discusses the evaluation of a high-fidelity prototype. We discuss our future steps and some findings related to use of the dashboard for exploratory data analysis that we believe generalise to similar work.

Citation: Sokratis Karkalas, Manolis Mavrikis and Oliver Labs (2016). "Towards Analytics for Educational Interactive e-Books. The case of the Reflective Designer Analytics Platform (RDAP)". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Since LAK2015, an increasing number of researchers have been taking learning design into consideration when predicting learning behavior and outcomes across different modules. Learning design is widely studied in the Higher Education sector, but few studies have empirically connected the learning designs of a substantial number of courses with learning behavior in Learning Management Systems (LMSs) and learning performance. This study builds on preliminary learning design work presented at LAK2015 by the Open University UK. In this study we linked 151 modules and 111,256 students with students' behavior (

Citation: Bart Rienties and Lisette Toetenel (2016). "The impact of 151 learning designs on student satisfaction and performance: social learning (analytics) matters". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

We analyse engagement and performance data arising from participants' interactions with an in-house LMS at Imperial College London while a cohort of students follow two courses on a new fully online postgraduate degree in Management. We identify and investigate two main questions relating to the relationships between engagement and performance, drawing recommendations for improved guidelines to inform the design of such courses.

Citation: Marc Wells, Alex Wollenschlaeger, David Lefevre, George Magoulas and Alexandra Poulovassilis (2016). "Analysing Engagement in an Online Management Programme and Implications for Course Design". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

This paper describes the development and evaluation of an affect-aware intelligent support component that is part of a learning environment known as iTalk2Learn. The intelligent support component is able to tailor feedback according to a student's affective state, which is deduced both from speech and interaction. The affect prediction is used to determine which type of feedback is provided and how that feedback is presented (interruptive or non-interruptive). The system includes two Bayesian networks that were trained with data gathered in a series of ecologically-valid Wizard-of-Oz studies, where the effect of the type of feedback and the presentation of feedback on students' affective states was investigated. This paper reports results from an experiment that compared a version that provided affect-aware feedback (affect condition) with one that provided feedback based on performance only (non-affect condition). Results show that students in the affect condition were significantly less off-task. Additionally, the results indicate that students in the affect condition were less bored. The results also show that students in both conditions made statistically significant learning gains, while students in the affect condition had higher learning gains than those in the non-affect condition, although this difference was not statistically significant in this study's sample. Taken together, the results point to the potential positive impact of affect-aware intelligent support.

Citation: Beate Grawemeyer, Manolis Mavrikis, Wayne Holmes, Sergio Gutierrez-Santos, Michael Wiedmann and Nikol Rummel (2016). "Affecting Off-Task Behaviour: How Affect-aware Feedback Can Improve Student Learning". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper explores the potential of analytics for improving accessibility of e-learning systems and for supporting disabled learners in their studies.

The work is set in the context of learning and academic analytics' focus on issues of retention. The definitions of disability and accessibility in e-learning are outlined, and the implications for how disabled students' needs may be modeled in learning analytics systems are briefly discussed.

A comparative analysis of completion rates between disabled and non-disabled students across a large data set covering five years of Open University modules is presented. The wide variation in comparative retention rates is noted and characterized. A key assertion of this paper is that learning analytics provide a set of tools for identifying and understanding such discrepancies, and that analytics can be used to focus interventions that will improve the retention of disabled students in particular. This quantitative approach is compared with that of qualitative end-of-module surveys. The paper also describes how an approach called Critical Learning Paths, currently being researched, may be used to identify accessibility deficits in module components that significantly impact the learning of disabled students.

An outline agenda for onward research currently being planned is given.

It is hoped that this paper will stimulate a wider interest in the potential benefits of learning analytics for higher educational institutions as they try to assure the accessibility of their e-learning and for the provision of support for disabled students.

Citation: Martyn Cooper, Rebecca Ferguson and Annika Wolff (2016). "What Can Analytics Contribute to Accessibility in e-Learning Systems and to Disabled Students' Learning?". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Systematic investigation of the collaborative problem solving process in open-ended, hands-on, physical computing design tasks requires a framework that highlights the main process features, stages and actions, which can then be used to provide 'meaningful' learning analytics data. This paper presents an analysis framework that can be used to identify crucial aspects of the collaborative problem solving process in practice-based learning activities. We deployed a mixed-methods approach that allowed us to generate an analysis framework that is theoretically robust and generalizable. Additionally, the framework is grounded in data and hence applicable to real-life learning contexts. This paper presents how our framework was developed and how it can be used to analyse data. We argue for the value of effective analysis frameworks in the generation and presentation of learning analytics for practice-based learning activities.

Citation: Mutlu Cukurova, Katerina Avramides, Daniel Spikol, Rose Luckin and Manolis Mavrikis (2016). "An Analysis Framework for Collaborative Problem Solving in Practice-based Learning Activities: A Mixed-method Approach". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

As the field of learning analytics continues to mature, there is a corresponding evolution and sophistication of the associated analytical methods and techniques. In this regard social network analysis (SNA) has emerged as one of the cornerstones of learning analytics methodologies. However, despite the noted importance of social networks for facilitating the learning process, it remains unclear how and to what extent such network measures are associated with specific learning outcomes. Motivated by Simmel's theory of social interactions and building on the argument that social centrality does not always imply benefits, this study aimed to further contribute to the understanding of the association between students' social centrality and their academic performance. The study reveals that learning analytics research drawing on SNA should incorporate both descriptive and statistical methods to provide a more comprehensive and holistic understanding of a student's network position. In so doing researchers can undertake more nuanced and contextually salient inferences about learning in network settings. Specifically, we show how differences in the factors framing students' interactions within two instances of a MOOC affect the association between the three social network centrality measures (i.e., degree, closeness, and betweenness) and the final course outcome.
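The three centrality measures named in this abstract can be illustrated on a toy interaction network. The sketch below is ours, not the authors': the student names, the network, and the use of the networkx library are invented for illustration only.

```python
# Illustrative sketch: degree, closeness and betweenness centrality on a toy
# network where an edge means two students interacted in a course forum.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("ana", "ben"), ("ana", "carl"), ("ana", "dee"),
    ("ben", "carl"), ("dee", "eli"),
])

degree = nx.degree_centrality(G)            # share of possible ties a student has
closeness = nx.closeness_centrality(G)      # inverse of average distance to others
betweenness = nx.betweenness_centrality(G)  # share of shortest paths through a node

# "ana" bridges the ben/carl cluster and the dee/eli pair, so she scores
# highest on all three measures in this toy graph.
top = max(degree, key=degree.get)
print(top, round(degree[top], 2))
```

As the study argues, a descriptive score like this only becomes meaningful when combined with statistical modelling of the context in which the interactions took place.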

Citation: Srećko Joksimović, Areti Manataki, Dragan Gašević, Shane Dawson, Vitomir Kovanović and Inés Friss de Kereki (2016). "Translating network position into performance: Importance of Centrality in Different Network Configurations". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Evidence of the month: April 2016

This report documents the emerging uses of learning analytics in the United States, Australia and the United Kingdom. Through a series of 11 case studies, it presents an overview of the evidence currently available of the impact that analytics are having on teaching and learning – and highlights some of the opportunities for the UK higher education sector.

The report was commissioned by Jisc, which is currently working with 50 universities in the UK to set up a national learning analytics service for higher and further education.

Although learning analytics are still at a relatively early stage of development, this report finds that there is convincing evidence from early adopters that they will help to improve outcomes.

The case studies presented in the report provide a snapshot of some of the most prominent institution-level learning analytics initiatives worldwide. Together they provide evidence that:

  • researchers have demonstrated the validity of the predictive models used by learning analytics systems
  • interventions carried out with students have been effective
  • there are other benefits to taking a more data-driven approach to higher education provision

Extrapolating from current practice, in the UK and internationally, the authors anticipate that learning analytics could make significant contributions:

  • As a tool for quality assurance and quality improvement
  • As a tool for boosting retention rates
  • As a tool for assessing and acting upon differential outcomes among the student population
  • As an enabler for the development and introduction of adaptive learning

The full case studies, with references, are available at

Citation: Sclater, N., Peasgood, A. & Mullan, J. (2016). Learning Analytics in Higher Education: A Review of UK and International Practice. Jisc, Bristol. | Url:

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper is the first empirical study linking learning design information for a substantial number of courses with Learning Management Systems (LMS)/Virtual Learning Environment (VLE) data and with learning performance.

The paper used hand-coded learning design data for 87 modules: the amount of time learners were expected to spend on each of seven types of learning design activity: assimilative; finding & handling information; communication; productive; experiential; interactive/adaptive; assessment. A cluster analysis of the designs of these modules suggested they fell into four broad groups: constructivist, assessment-driven, balanced-variety and social constructivist modules.
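The kind of cluster analysis described above can be sketched as follows. This is a minimal illustration with invented data, not the study's actual method or dataset; the choice of k-means and scikit-learn is ours.

```python
# Illustrative sketch: clustering 87 modules by the share of study time
# allocated to each of the seven learning design activity types.
import numpy as np
from sklearn.cluster import KMeans

activity_types = ["assimilative", "finding_info", "communication",
                  "productive", "experiential", "interactive", "assessment"]

# Toy data: each row is one module's proportion of time per activity type
# (rows sum to 1); drawn at random purely for demonstration.
rng = np.random.default_rng(0)
modules = rng.dirichlet(np.ones(len(activity_types)), size=87)

# Partition the modules into four broad design profiles.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(modules)
labels = kmeans.labels_
print(np.bincount(labels))  # how many modules fall in each cluster
```

In the study itself the four clusters were then interpreted pedagogically (constructivist, assessment-driven, balanced-variety, social constructivist) by inspecting which activity types dominated each group.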

The authors found significant correlations between aspects of learning design and LMS/VLE usage data. Learners on the constructivist modules spent significantly more time on the LMS/VLE than learners on the other modules. The learning design decisions of teachers seemed to strongly influence how students were engaging with the LMS/VLE. Specifically, when more inquiry- or social constructivist learning activities were included in the learning design, learners tended to use the LMS/VLE more.

Finally, they explored the link with learning performance (completion and pass rates). They found no correlation between LMS/VLE activity and learning performance. However, there was a significant negative correlation between assimilative learning activity and learning performance, i.e. modules that had more time devoted to reading/watching/listening activity had significantly lower completion and pass rates.

The authors note that this was a small sample: full data was only available for 40 modules. However, it is a good first example of a study linking learning design with VLE/LMS data and with learning performance, and is evidence that the choices teachers make in learning design terms do have an effect on the learners, both in terms of VLE/LMS activity and learning performance.

(A further study with 157 modules is in press: Toetenel, Lisette and Rienties, Bart (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology (in press).)


Citation: Bart Rienties, Lisette Toetenel, and Annie Bryan. 2015. "Scaling up" learning design: impact of learning design activities on LMS behavior and performance. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (LAK '15). ACM, New York, NY, USA, 315-319. | Url:

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

Brockenhurst College, in the south of England, implemented learning analytics in September 2015, with the aim of improving student recruitment and retention by 15% over the next five years.

The predictive analytics used by the further education college (catering mainly for students aged 16 to 19) are based on a regression model that employs five years of student data. When the model was tested on a previous year's data, its predictions were correct in 87% of cases.

A start-of-term model identifies students who are likely to be most at risk of drop-out before they join the college. An in-term model looks at student behaviours once they join the college. The two models are compared to see if an individual student's risk levels are rising or falling.
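The comparison of the two models can be sketched in a few lines. This is an illustrative example only: the function name, thresholds, and risk values are invented and do not describe the college's actual system.

```python
# Illustrative sketch: flagging whether a student's drop-out risk is rising or
# falling by comparing a start-of-term estimate with an in-term estimate.
def risk_trend(start_of_term_risk: float, in_term_risk: float,
               tolerance: float = 0.05) -> str:
    """Classify the change between two drop-out risk probabilities (0..1)."""
    delta = in_term_risk - start_of_term_risk
    if delta > tolerance:
        return "rising"
    if delta < -tolerance:
        return "falling"
    return "stable"

print(risk_trend(0.30, 0.55))  # risk has grown since enrolment
print(risk_trend(0.40, 0.20))  # early support appears to be working
```

A rising trend would prompt the proactive conversations between tutors and students described below, with the contributing factors shown alongside the prediction rather than as a black-box score.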

Using learning analytics has changed the way in which the college deals with interventions and how it supports students who are experiencing problems. Previously, tutors identified and reacted to problems. This could mean that it would take three or four weeks before an intervention strategy was in place – a long time in a 30-week programme.

Now, data from the predictive model is shared early on, allowing proactive as well as reactive interventions. This is not a black-box model – predictions are shared together with the factors that the model suggests are making a student high risk.

Analytics are shared not only with staff, but also with students. Visualisations show students whether their individual risk of drop-out is rising or falling, but do not compare their risk with that of other students. The intention is to spark conversations between students and tutors about the factors that influence drop-out and how these can be addressed.

In the video referenced here, Dom Chapman, Director of Learners at the college, talks to the 2015 conference of the Association of Learning Technology (ALT) about the initiative.

The analytics have only just been rolled out, so there is currently no evidence of their effect on teaching and learning, but the video does provide evidence that analytics are being rolled out at scale – in this case at a college with 8,000 learners.

Citation: Association for Learning Technology (2015), '2015 #altc: Invited Speaker - Dom Chapman', YouTube video | Url: