Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

This article evaluates the social capital of the European teachers participating in activities on the eTwinning Portal, run by European Schoolnet, which in 2010 had more than 160,000 registered teachers from 35 countries involved in more than 19,000 projects. The evaluation was performed using the Social Network Analysis (SNA) approach.

The authors found correlations "between social network analysis measures like degree and betweenness centrality as well as the local clustering coefficient, activity statistics about usage of eTwinning and the quality management of European Schoolnet".

For the analysis of eTwinning network data, three Learning Analytics tools have been developed:

  • eVa (eTwinning Network Visualization and Analysis), a network visualization and simple analysis tool.
  • CAfe (Competence Analyst for eTwinning), an SNA-based competence management and teachers' self-monitoring tool.
  • AHTC (Ad Hoc Transient Communities) services, which involve users into question-answer activities on the eTwinning Portal.
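The SNA measures named above can be illustrated with a minimal sketch. The network below is an invented toy example, not eTwinning data, and only two of the measures (degree centrality and the local clustering coefficient) are computed; the paper's actual analysis used the tools listed above.

```python
# Toy undirected collaboration network (hypothetical teachers).
from itertools import combinations

edges = [("anna", "ben"), ("anna", "carla"), ("anna", "dmitri"),
         ("ben", "carla"), ("carla", "dmitri"), ("anna", "elena")]

# Build a symmetric adjacency map: node -> set of neighbours.
network = {}
for u, v in edges:
    network.setdefault(u, set()).add(v)
    network.setdefault(v, set()).add(u)

def degree_centrality(g, node):
    # Fraction of the other nodes this node is directly connected to.
    return len(g[node]) / (len(g) - 1)

def local_clustering(g, node):
    # Fraction of pairs of neighbours that are themselves connected.
    nbrs = g[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in g[a])
    return 2 * links / (k * (k - 1))

# "anna" is connected to all 4 other teachers: degree centrality 1.0;
# only 2 of the 6 pairs of her neighbours collaborate: clustering 1/3.
```

In an eTwinning-style analysis, high degree centrality would flag well-connected teachers, while a low clustering coefficient would suggest a teacher bridging otherwise unconnected groups.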
Citation: M.C. Pham, Y. Cao, Z. Petrushyna, R. Klamma. "Learning Analytics in a Teachers' Social Network" (2012). Proceedings of the 8th International Conference on Networked Learning 2012, ISBN 978-1-86220-283-2 | Url:

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector: | Country:

The paper defines privacy as the regulation of how personal digital information is being observed by the self or distributed to other observers. It defines ethics as the systematization of correct and incorrect behaviour in virtual spaces according to all stakeholders.

Principles identified within this paper are:

  • transparency
  • student control over data
  • rights of access
  • accountability and assessment

The authors argue that, by discussing the various aspects within each of these categories, institutions have mechanisms to assess their initiatives and achieve compliance with current laws and regulations as well as with socially derived requirements.

In terms of the Evidence Hub proposition 'Learning analytics are used in an ethical way', the need for a paper such as this to be written implies that appropriate measures are being developed but are not yet in place. As the authors note, 'The ethical and privacy issues derived from these scenarios are not properly addressed.'

Summary provided within the paper:

What is already known about this topic
• Learning analytics offers the possibility of collecting detailed information about how students learn.
• The ethical and privacy issues derived from these scenarios are not properly addressed.
• There is a need to clarify how these issues must be addressed from the early stages of the deployment of a learning analytics scenario.
What this paper adds
• An account of how the main legislations are advancing in the general area of privacy.
• A comparison of how other disciplines such as medicine have dealt with privacy issues when collecting private information.
• The description of a group of practical principles in which to include all the ethical and privacy-related issues present when deploying a learning analytics application.
Implications for practice and/or policy
• Designers may now take into account these principles to guide the implementation of learning analytics platforms.
• Students are now aware of the different categories of issues that must be addressed by applications collecting data while they learn.
• Instructors now have a more detailed account of the issues to address when adopting learning analytics techniques.

Citation: Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438-450. | Url:

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

Brockenhurst College, in the south of England, implemented learning analytics in September 2015, with the aim of improving student recruitment and retention by 15% over the next five years.

The predictive analytics used by the further education college (catering mainly for students aged 16 to 19) are based on a regression model that employs five years of student data. When the model was tested on a previous year's data, its predictions were correct in 87% of cases.

A start-of-term model identifies students who are likely to be most at risk of drop-out before they join the college. An in-term model looks at student behaviours once they join the college. The two models are compared to see if an individual student's risk levels are rising or falling.
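The two-model comparison described above can be sketched in outline. The features, weights, and thresholds below are entirely invented for illustration; the college's actual regression model has not been published.

```python
# Hypothetical sketch: compare a start-of-term risk score with an
# in-term score to decide whether a student's drop-out risk is rising
# or falling. All features and weights are invented for illustration.

def start_of_term_risk(prior_attainment, distance_km):
    # Risk from data known before enrolment (0 = low, 1 = high).
    score = 0.6 * (1 - prior_attainment) + 0.4 * min(distance_km / 50, 1)
    return round(score, 3)

def in_term_risk(attendance, vle_logins_per_week):
    # Risk from behaviour observed after the student joins the college.
    score = (0.7 * (1 - attendance)
             + 0.3 * (1 - min(vle_logins_per_week / 10, 1)))
    return round(score, 3)

def risk_trend(start_risk, current_risk, tolerance=0.05):
    # Compare the two models' outputs for an individual student.
    if current_risk > start_risk + tolerance:
        return "rising"
    if current_risk < start_risk - tolerance:
        return "falling"
    return "stable"

# A student predicted at-risk before enrolment (score 0.5) who then
# attends 90% of sessions and logs in 8 times a week scores 0.13
# in term, so their risk trend is "falling".
```

The point of the sketch is the comparison step: sharing the trend together with the contributing factors is what keeps the model from being a black box for tutors.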

Using learning analytics has changed the way in which the college deals with interventions and how it supports students who are experiencing problems. Previously, tutors identified and reacted to problems. This could mean that it would take three or four weeks before an intervention strategy was in place – a long time in a 30-week programme.

Now, data from the predictive model is shared early on, allowing proactive as well as reactive interventions. This is not a black-box model – predictions are shared together with the factors that the model suggests are making a student high risk.

Analytics are shared not only with staff, but also with students. Visualisations show students whether their individual risk of drop-out is rising or falling, but do not compare their risk with that of other students. The intention is to spark conversations between students and tutors about the factors that influence drop-out and how these can be addressed.

In the video referenced here, Dom Chapman, Director of Learners at the college, talks to the 2015 conference of the Association for Learning Technology (ALT) about the initiative.

The analytics have only just been rolled out, so there is currently no evidence of their effect on teaching and learning, but the video does provide evidence that analytics are being rolled out at scale – in this case at a college with 8,000 learners.

Citation: Association for Learning Technology (2015), '2015 #altc: Invited Speaker - Dom Chapman', YouTube video | Url:

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector:

Evidence of the Month: June 2015

This literature review is intended to provide a comprehensive background for understanding current knowledge related to learning analytics and educational data mining and its impact on adaptive learning. The authors examined the literature on experimental case studies conducted in the domain between 2008 and 2013. They analysed and categorised the research questions, methodology and findings of the 40 published papers that met their inclusion criteria, and went on to evaluate and interpret the findings of these papers.

In terms of the LACE Evidence Hub, the literature review is particularly interesting because it summarises the findings of these key papers, classifying the pedagogical results in terms of formal and non-formal learning. It thus provides an overview of current evidence in the field, as well as evidence of the scale of deployment.

Citation: Papamitsiou, Zacharoula, & Economides, Anastasios A. (2014). Learning analytics and educational data mining in practice: a systematic literature review of empirical evidence. Educational Technology & Society, 17(4), 49-64. | Url:

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector:

There is growing concern about the extent to which individuals are tracked while online. Within higher education, understanding of issues surrounding student attitudes to privacy is influenced not only by the apparent ease with which members of the public seem to share the detail of their lives, but also by the traditionally paternalistic institutional culture of universities.

This paper explores issues around consent and opting in or out of data tracking. It considers how three providers of massive open online courses (Coursera, EdX and FutureLearn) inform users about data usage. It also discusses how higher education institutions can work toward an approach that engages students and informs them in more detail about the implications of learning analytics on their personal data.

The paper restates the need to

  1. develop a coherent approach to consent, taking into account the findings of research into how people make decisions about personal data
  2. recognise that people can only engage selectively in privacy self management
  3. adjust the timing of privacy law to take into account that data may be combined and reanalysed in the future
  4. develop more substantive privacy rules

This paper was nominated for the best paper award at the 2015 Learning Analytics and Knowledge conference.

Citation: Prinsloo, Paul, & Slade, Sharon. (2015). Student privacy self-management: implications for learning analytics. Paper presented at the LAK '15, Poughkeepsie, NY. DOI 10.1145/2723576.2723585 | Url:

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector:

This paper reports a scientometric analysis, coupled with a manual content analysis, of the proceedings and programme of LAK 2013. Authorship analysis reveals an open and international community, while internal citation analysis provides evidence of the construction of a body of knowledge central to learning analytics.

Learning analytics is found to be a diverse multidisciplinary field which, in 2013, included 40 core researchers and around 100 additional authors. In addition to papers reflecting on the field, five main topics are identified: visualization, behaviour analysis, social learning analytics, learning analytics for MOOCs, and learning analytics issues (such as ethics and scalability). The body of knowledge central to learning analytics is small, but already shows signs of consolidation.

Scientometrics involves the use of simple statistics about authors and their papers to provide an indication of the reach, diversity, and ‘health’ of a research community and its knowledge-building process.
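The kind of simple authorship statistic the paper describes can be sketched as follows. The author names and paper lists here are invented placeholders, not the actual LAK 2013 programme; the split between recurring and one-off authors loosely mirrors the paper's distinction between core researchers and additional authors.

```python
# Hypothetical sketch of a basic scientometric statistic: count how
# many papers each author contributed to, and separate recurring
# ("core") authors from one-off ("additional") authors.
# All data below is invented for illustration.
from collections import Counter

papers = [
    ["author_a", "author_b"],
    ["author_a", "author_c", "author_d"],
    ["author_b", "author_e"],
    ["author_a", "author_f"],
]

# Number of papers per author across the whole programme.
paper_counts = Counter(name for authors in papers for name in authors)

core = sorted(n for n, c in paper_counts.items() if c >= 2)
additional = sorted(n for n, c in paper_counts.items() if c == 1)
```

Citation analysis within the proceedings (which papers cite which) works the same way in principle, counting internal references instead of authorships.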

Citation: Ochoa, Xavier, Suthers, Dan, Verbert, Katrien, & Duval, Erik. (2014). Analysis and reflections on the third Learning Analytics and Knowledge conference (LAK 2013). Journal of Learning Analytics, 1(2), 5-22. | Url:

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper seeks to unite learning analytics and action research. It investigates how the multitude of questions that arise during technology-enhanced teaching and learning can be systematically mapped to sets of indicators. It identifies questions that are not yet supported and proposes indicators that offer a high potential for positively influencing teachers’ didactical considerations. The investigation shows that many questions from teachers cannot be answered using the research tools that are currently available. Furthermore, few learning analytics studies measure impact. The paper identifies effects that learning analytics could have on teaching and discusses how this could be evaluated.

Comment from Martyn Cooper: This is a meta-study that looks at a wide range of learning analytics tools. It highlights that the impact of learning analytics tools has not been evaluated, and suggests how action research could contribute to addressing this. At this stage, the inclusion of this paper in an evidence hub is useful precisely because it highlights the lack of evidence.

Citation: A. L. Dyckhoff, V. Lukarov, A. Muslim, M. A. Chatti and U. Schroeder (2013). Supporting action research with learning analytics. In: Third International Learning Analytics and Knowledge Conference (LAK13), 8-12 April, Leuven, Belgium. | Url: