Tag Archives: Evidence of the Month

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Evidence of the month: April 2016

This report documents the emerging uses of learning analytics in the United States, Australia and the United Kingdom. Through a series of 11 case studies, it presents an overview of the evidence currently available of the impact that analytics are having on teaching and learning – and highlights some of the opportunities for the UK higher education sector.

The report was commissioned by Jisc, which is currently working with 50 universities in the UK to set up a national learning analytics service for higher and further education.

Although learning analytics are still at a relatively early stage of development, this report finds that there is convincing evidence from early adopters that they will help to improve outcomes.

The case studies presented in the report provide a snapshot of some of the most prominent institution-level learning analytics initiatives worldwide. Together they provide evidence that:

  • researchers have demonstrated the validity of the predictive models used by learning analytics systems
  • interventions carried out with students have been effective
  • there are other benefits to taking a more data-driven approach to higher education provision

Extrapolating from current practice, in the UK and internationally, the authors anticipate that learning analytics could make significant contributions:

  • As a tool for quality assurance and quality improvement
  • As a tool for boosting retention rates
  • As a tool for assessing and acting upon differential outcomes among the student population
  • As an enabler for the development and introduction of adaptive learning

The full case studies, with references, are available at http://bit.ly/1WczdOn

Citation: Sclater, N., Peasgood, A. & Mullan, J. (2016). Learning Analytics in Higher Education: A Review of UK and International Practice. Jisc, Bristol. | Url: https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v2_0.pdf

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Learning analytics makes extensive use of trace data from learners interacting with Learning Management Systems (LMS), and one of the most common uses is to derive an estimate of the time each learner spent on task, that is, engaging in particular learning activities. The authors of this paper note that while this approach is widely used, the details are often not given, and the consequences of choices made in generating these measures are not explored.
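To make this concrete, below is a minimal sketch of one common way time on task is estimated from timestamped LMS event logs: sum the gaps between a learner's consecutive events, discarding any gap longer than an inactivity cutoff. The event data and the 30-minute cutoff are illustrative assumptions, not values from the paper; as the authors emphasise, exactly these kinds of choices shape the resulting measure.

    from datetime import datetime

    # Hypothetical LMS event log for one learner: (timestamp, learner_id) pairs.
    events = [
        ("2015-03-01 09:00:00", "s1"),
        ("2015-03-01 09:04:00", "s1"),
        ("2015-03-01 09:06:00", "s1"),
        ("2015-03-01 13:00:00", "s1"),  # long gap: treated as a new session
        ("2015-03-01 13:02:00", "s1"),
    ]

    CUTOFF_MINUTES = 30  # gaps longer than this are not counted as time on task

    def time_on_task(events, cutoff_minutes=CUTOFF_MINUTES):
        """Sum the gaps between consecutive events, ignoring gaps above the cutoff."""
        times = sorted(datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t, _ in events)
        total = 0.0
        for prev, curr in zip(times, times[1:]):
            gap = (curr - prev).total_seconds() / 60
            if gap <= cutoff_minutes:
                total += gap
        return total  # minutes

    print(time_on_task(events))  # 8.0 with a 30-minute cutoff; a different cutoff gives a different estimate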

They present two experiments exploring different measures of time on task, one using data from a fully-online course at a Canadian university, and another using data from a blended course at an Australian university.

They found that time-on-task measures typically outperformed count measures for predicting student performance, but more importantly, identified that the precise choice of time-on-task measure "can have a significant effect on the overall fit of the model, its significance, and eventually on the interpretation of research findings".

Given their findings, they argue that there are implications for research in learning analytics: "Above all is the need for more caution when using time-on-task measures for building learning analytics models. Given that details of time-on-task estimation can potentially impact reported research findings, appropriately addressing time-on-task estimation becomes a critical part of standard research practice in the learning analytics community. This is particularly true in cases where time-on-task measures are not accompanied by additional measures such as counts of relevant activities."

They call for researchers to recognise the importance of time-on-task measures, for fuller detail to be given of the measures used, and for further investigation.

(This paper is an expanded version of an earlier paper at LAK '15, which reported only on the data from the Canadian university: Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., & Hatala, M. (2015). Penetrating the black box of time-on-task estimation. Paper presented at LAK '15, Poughkeepsie, NY. DOI: 10.1145/2723576.2723623.)

Citation: Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., & Hatala, M. (2015) Does Time-on-task Estimation Matter? Implications on Validity of Learning Analytics Findings. | Url: https://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/4501

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper is the first empirical study linking learning design information for a substantial number of courses with Learning Management Systems (LMS)/Virtual Learning Environment (VLE) data and with learning performance.

The paper used hand-coded learning design data for 87 modules: the amount of time learners were expected to spend on each of seven types of learning design activity: assimilative; finding & handling information; communication; productive; experiential; interactive/adaptive; assessment. A cluster analysis of the designs of these modules suggested they fell into four broad groups: constructivist, assessment-driven, balanced-variety and social constructivist modules.
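As an illustration of this kind of analysis, the sketch below clusters hypothetical per-module activity-time profiles into four groups with k-means. The summary above does not specify the clustering algorithm used in the paper, so k-means, scikit-learn and the synthetic data are all assumptions made for the example.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    ACTIVITY_TYPES = ["assimilative", "finding_handling_information", "communication",
                      "productive", "experiential", "interactive_adaptive", "assessment"]

    # Hypothetical design data: one row per module, one column per activity type,
    # holding the hours learners are expected to spend on that activity.
    rng = np.random.default_rng(0)
    design_hours = rng.uniform(0, 100, size=(87, len(ACTIVITY_TYPES)))

    # Standardise the profiles, then cluster the module designs into four groups.
    X = StandardScaler().fit_transform(design_hours)
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

    for cluster in range(4):
        members = design_hours[labels == cluster]
        print(f"cluster {cluster}: {len(members)} modules, "
              f"mean hours per activity = {members.mean(axis=0).round(1)}")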

The authors found significant correlations between aspects of learning design and LMS/VLE usage data. Learners on the constructivist modules spent significantly more time on the LMS/VLE than learners on the other modules. The learning design decisions of teachers seemed to strongly influence how students were engaging with the LMS/VLE. Specifically, when more inquiry- or social constructivist learning activities were included in the learning design, learners tended to use the LMS/VLE more.

Finally, they explored the link with learning performance (completion and pass rates). They found no correlation between LMS/VLE activity and learning performance. However, there was a significant negative correlation between assimilative learning activity and learning performance, i.e. modules that had more time devoted to reading/watching/listening activity had significantly lower completion and pass rates.

The authors note that this was a small sample: full data was only available for 40 modules. However, it is a good first example of a study linking learning design with VLE/LMS data and with learning performance, and is evidence that the choices teachers make in learning design terms do have an effect on the learners, both in terms of VLE/LMS activity and learning performance.

(A further study with 157 modules is in press: Toetenel, Lisette and Rienties, Bart (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology (in press).)

 

Citation: Bart Rienties, Lisette Toetenel, and Annie Bryan. 2015. "Scaling up" learning design: impact of learning design activities on LMS behavior and performance. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (LAK '15). ACM, New York, NY, USA, 315-319. | Url: http://dl.acm.org/citation.cfm?doid=2723576.2723600

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

This longitudinal study explores the effects on self-regulated learning of tracking and monitoring learning time with a mobile tool. Graduate students (n = 36) from three different online courses used their own mobile devices to track how much time they devoted to learning over a period of four months. Repeated measures of the Online Self-Regulated Learning Questionnaire and the Validity and Reliability of Time Management Questionnaire were taken during the course. The findings reveal positive effects of time tracking on time management skills. Variations in the channel, content and timing of the mobile notifications designed to foster reflective practice are investigated, and time-logging patterns are described. These results not only provide evidence of the benefits of recording learning time, but also suggest relevant cues on how mobile notifications should be designed and prompted to support self-regulated learning in online courses.


Citation: Tabuenca, B., Kalz, M., Drachsler, H., & Specht, M. (2015). Time will tell: The role of mobile learning analytics in self-regulated learning. Computers & Education, 89, 53–74. | Url: http://dspace.ou.nl/handle/1820/6172

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

Brockenhurst College, in the south of England, implemented learning analytics in September 2015, with the aim of improving student recruitment and retention by 15% over the next five years.

The predictive analytics used by the further education college (catering mainly for students aged 16 to 19) are based on a regression model that employs five years of student data. When the model was tested on a previous year's data, its predictions were correct in 87% of cases.

A start-of-term model identifies students who are likely to be most at risk of drop-out before they join the college. An in-term model looks at student behaviours once they join the college. The two models are compared to see if an individual student's risk levels are rising or falling.
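The college's model is not described here in enough detail to reproduce, but the two-model comparison can be sketched as below, using synthetic data and logistic regression as stand-ins; the features, the library and the choice of logistic regression are assumptions for illustration only.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Hypothetical historical records: pre-entry features (e.g. prior attainment)
    # and whether each student withdrew (1) or completed (0).
    X_entry_hist = rng.normal(size=(5000, 4))
    withdrew = (rng.random(5000) < 0.15).astype(int)
    start_of_term_model = LogisticRegression().fit(X_entry_hist, withdrew)

    # An in-term model would instead use behavioural data (attendance, VLE use, marks).
    X_behaviour_hist = rng.normal(size=(5000, 3))
    in_term_model = LogisticRegression().fit(X_behaviour_hist, withdrew)

    # For a current student, compare the two risk estimates to see whether risk is rising or falling.
    pre_entry = rng.normal(size=(1, 4))
    behaviour = rng.normal(size=(1, 3))
    risk_at_entry = start_of_term_model.predict_proba(pre_entry)[0, 1]
    risk_now = in_term_model.predict_proba(behaviour)[0, 1]
    print(f"risk at entry {risk_at_entry:.2f}, risk now {risk_now:.2f}: "
          f"{'rising' if risk_now > risk_at_entry else 'falling'}")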

Using learning analytics has changed the way in which the college deals with interventions and how it supports students who are experiencing problems. Previously, tutors identified and reacted to problems. This could mean that it would take three or four weeks before an intervention strategy was in place – a long time in a 30-week programme.

Now, data from the predictive model is shared early on, allowing proactive as well as reactive interventions. This is not a black-box model – predictions are shared together with the factors that the model suggests are making a student high risk.

Analytics are shared not only with staff, but also with students. Visualisations show students whether their individual risk of drop-out is rising or falling, but do not compare their risk with that of other students. The intention is to spark conversations between students and tutors about the factors that influence drop-out and how these can be addressed.

In the video referenced here, Dom Chapman, Director of Learners at the college, talks to the 2015 conference of the Association for Learning Technology (ALT) about the initiative.

The analytics have only just been rolled out, so there is currently no evidence of their effect on teaching and learning, but the video does provide evidence that analytics are being rolled out at scale – in this case at a college with 8,000 learners.

Citation: Association for Learning Technology (2015), '2015 #altc: Invited Speaker - Dom Chapman', YouTube video https://www.youtube.com/watch?v=7j7IBh2Hrik | Url: https://www.youtube.com/watch?v=7j7IBh2Hrik

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

Evidence of the month: October 2015

Uruguay was the first country in the world to meet the target of one laptop per child. The programme that achieved this, Plan Ceibal, covers all public schools in Uruguay, and around 700,000 students and teachers now have their own device. The aim is to use this technology to support and develop learning and teaching.

Plan Ceibal also operates and integrates national-scale databases, populated with data from a range of management and educational activities. These include demographic data, performance data and system usage data.

This conference paper sets out some of the steps that will be required to implement a learning analytics strategy within Plan Ceibal. This has the potential to support and develop current educational technology and learning policies across the country.

 

Citation: Bailon, Martina, Carballo, Mauro, Cobo, Cristobal, Magnone, Soledad, Marconi, Cecilia, Mateu, Matias, & Susunday, Hernan. (2015). How can Plan Ceibal land into the age of big data? Paper presented at the Fourth International Conference on Data Analytics, Nice, France. | Url: http://www.fundacionceibal.edu.uy/en/news/learninganalytics-education-edtech-and-bigdata-challenging-attractive-opportunity

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

Evidence of the month: September 2015

In these two videos

  • https://www.youtube.com/watch?v=uu9ii38TcFI
  • https://www.youtube.com/watch?v=9Z-hp5NrSBg

Tim Renick talks to a committee of the US Senate about the successful large-scale application of predictive analytics at Georgia State University. The university aims to ensure – using reliable data – that students are doing what they need to do within the context of their ability and their resources and that they are making significant progress towards their degrees.

Renick, who is Vice Provost of the university, explains that they are using data proactively, picking up on problems that are associated with low grades or student dropout. He states that this has resulted in

  • Reduction of the average time it takes students to gain a degree
  • 1700 more students graduating annually than did so five years ago
  • Elimination of achievement gaps based on race, ethnicity and economics.

The university is currently tracking 30,000 students, using predictive modeling based on ten years of data and 800 risk factors.

Students each have individual pathways, made up of a set of courses they should be taking each semester. Those who are making mistakes will find this out almost immediately, as the analytics will trigger a one-to-one meeting with an adviser. These meetings are personalized, engaging students, who get to know their support team better than was the case in the past. Early weaknesses in performance are addressed by immediate interventions, so they are removed rather than compounded.

These short videos do not set out detailed data and analysis, but they do provide evidence of analytics being rolled out at scale, and of engagement with these analytics at a national level.

Citation: 'Georgia State University - Dr. Tim Renick' (2015), YouTube, uploaded by Georgia State University | Url: https://www.youtube.com/watch?v=9Z-hp5NrSBg

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Evidence of the month: August 2015

This meta-analysis looked at studies comparing learning outcomes for learners using Intelligent Tutoring Systems (ITS) with those for learners using other learning opportunities. The authors identified 107 such studies and found that:

  • ITS were associated with greater achievement when compared to teacher-led group instruction, other sorts of computer instruction and textbooks.
  • There was no significant difference in outcomes between ITS and small-group or individualised human tutors.
  • The positive findings applied across all levels of education, and regardless of whether ITS were used as the main means of teaching, a supplement, part of a teacher-led strategy, or as a homework aid.

This is not a new finding: previous work has come to broadly similar conclusions. However, this study is recent and large in scope.

Citation: Ma, W., Adesope, O. O., Nesbit, J. C., & Liu, Q. (2014). Intelligent tutoring systems and learning outcomes: A meta-analysis. Journal of Educational Psychology, 106(4), 901. | Url: http://www.apa.org/pubs/journals/features/edu-a0037123.pdf

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Evidence of the month: July 2015

This paper focuses on the issues of stability and sensitivity of learning-analytics-based prediction models. It investigates whether prediction models remain the same when the instructional context is repeated with a new cohort of students, and whether prediction models change when relevant aspects of the instructional context are adapted.

The research applies Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and learning management systems.

The paper compares two cohorts of a large introductory quantitative methods module, with 1,005 students in the 2013-2014 cohort, and 1,006 students in the 2014-2015 cohort. Both modules were based on principles of blended learning – combining face-to-face problem-based learning sessions with e-tutorials – and had a similar instructional design, except for an intervention into the design of quizzes administered in the module. Focusing on their predictive power, the paper provides evidence of the stability and the sensitivity of regression-type prediction models.

The paper reports that the value of dispositional data is strongly dependent on the time at which richer (assessment) data become available, and on the need for timely signalling of underperformance. If timely feedback is required, the combination of data extracted from e-tutorials (both in practice and test modes) and learning disposition data was found to be the best mix to support learning analytics applications.

Feedback related to learning dispositions (for example, by flagging suboptimal learning strategies, or inappropriate learning regulation) is generally open to interventions to improve the learning process. The same is true of feedback related to suboptimal use of e-tutorials; it is both predictive and open for intervention.

This paper won the Best Paper award at the 7th International Conference on Computer Supported Education.

Citation: Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). Stability and sensitivity of Learning Analytics based prediction models. In: Proceedings of the 7th International Conference on Computer Supported Education (Helfert, Markus; Restivo, Maria Teresa; Zvacek, Susan and Uho, James eds.), 23-25 May 2015, Lisbon, Portugal, CSEDU, pp. 156-166. | Url: http://oro.open.ac.uk/43446/

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

Evidence of the Month: June 2015

This literature review is intended to provide a comprehensive background for understanding current knowledge related to learning analytics and educational data mining and its impact on adaptive learning. The authors examined the literature on experimental case studies conducted in the domain between 2008 and 2013. They analysed and categorised the research questions, methodology and findings of the 40 published papers that met their inclusion criteria, and went on to evaluate and interpret the findings of these papers.

In terms of the LACE Evidence Hub, the literature review is particularly interesting because it summarises the findings of these key papers, classifying the pedagogical results in terms of formal and non-formal learning. It thus provides an overview of current evidence in the field, as well as evidence of the scale of deployment.

Citation: Papamitsiou, Zacharoula, & Economides, Anastasios A. (2014). Learning analytics and educational data mining in practice: a systematic literature review of empirical evidence. Educational Technology & Society, 17(4), 49-64. | Url: www.ifets.info/download_pdf.php?j_id=65&a_id=1516