This article evaluates the social capital of European teachers participating in activities on the eTwinning Portal, which is run by European Schoolnet and which, as of 2010, had more than 160,000 registered teachers from 35 countries involved in more than 19,000 projects. The evaluation was performed using a Social Network Analysis (SNA) approach.
The authors report correlations "between social network analysis measures like degree and betweenness centrality as well as the local clustering coefficient, activity statistics about usage of eTwinning and the quality management of European Schoolnet".
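The network measures mentioned here can be illustrated with a minimal, self-contained sketch. The graph below is invented for illustration (it is not eTwinning data); in practice a library such as NetworkX would also supply betweenness centrality, which is omitted here for brevity.

```python
# Illustrative sketch: degree and local clustering coefficient on a small,
# made-up undirected collaboration network (not eTwinning data).
from itertools import combinations

# adjacency sets: teacher -> set of collaborators
graph = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B"},
    "D": {"A"},
}

def degree(g, node):
    """Number of direct collaborators."""
    return len(g[node])

def local_clustering(g, node):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    neighbours = g[node]
    k = len(neighbours)
    if k < 2:
        return 0.0
    links = sum(1 for u, v in combinations(neighbours, 2) if v in g[u])
    return 2 * links / (k * (k - 1))

print(degree(graph, "A"))            # 3
print(local_clustering(graph, "A"))  # 1 of 3 neighbour pairs linked -> 1/3
```

Degree captures how many direct ties a teacher has, while the local clustering coefficient captures how tightly knit their immediate neighbourhood is; both were among the measures the authors correlated with eTwinning activity statistics.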
For the analysis of eTwinning network data, three Learning Analytics tools have been developed:
eVa (eTwinning Network Visualization and Analysis), a network visualization and simple analysis tool.
CAfe (Competence Analyst for eTwinning), an SNA-based competence management and teachers' self-monitoring tool.
AHTC (Ad Hoc Transient Communities) services, which involve users in question-and-answer activities on the eTwinning Portal.
Affect and behaviour detectors were used to estimate student affective states and behaviour based on analysis of tutor log data. For every student action, the detectors estimated the probability that the student was in a state of boredom, engaged concentration, confusion, or frustration. They also estimated the probability that the student was engaging in off-task or gaming behaviours.
Boredom during problem solving was negatively correlated with performance
Boredom during scaffolded tutoring was positively correlated with performance
Confusion during problem solving was negatively correlated with performance
Confusion during scaffolded tutoring was positively correlated with performance
Engaged concentration was associated with positive learning outcomes
Frustration was associated with positive learning outcomes
Gaming the system was associated with poorer learning.
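As a rough illustration of how per-action detector output of this kind might be aggregated into per-student affect features, consider the following sketch. The field names and probabilities are invented assumptions for illustration, not the study's actual detector pipeline.

```python
# Hedged sketch: averaging per-action affect-state probabilities into
# per-student summary features. Data and state names are invented.
from collections import defaultdict

# each log row: (student_id, {affect_state: estimated probability})
action_estimates = [
    ("s1", {"boredom": 0.7, "engaged": 0.1, "confusion": 0.1, "frustration": 0.1}),
    ("s1", {"boredom": 0.2, "engaged": 0.6, "confusion": 0.1, "frustration": 0.1}),
    ("s2", {"boredom": 0.1, "engaged": 0.8, "confusion": 0.05, "frustration": 0.05}),
]

def mean_affect(rows):
    """Average each state's probability over all of a student's actions."""
    sums = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(int)
    for student, probs in rows:
        counts[student] += 1
        for state, p in probs.items():
            sums[student][state] += p
    return {s: {state: total / counts[s] for state, total in states.items()}
            for s, states in sums.items()}

profiles = mean_affect(action_estimates)
print(profiles["s1"]["boredom"])  # (0.7 + 0.2) / 2, i.e. about 0.45
```

Per-student averages of this kind are one plausible way such estimates could feed the correlational findings listed above.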
A unified model was used to predict student standardized examination scores from a combination of student affect, disengaged behaviour, and performance within the learning system. This model achieved high overall correlation with standardized exam scores.
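A minimal sketch of such a predictive model, assuming a single engagement feature and ordinary least squares; the numbers are invented and the paper's actual model and feature set are not reproduced here.

```python
# Hedged sketch: predicting an exam score from one affect feature by
# ordinary least squares. All data points are invented for illustration.
xs = [0.8, 0.5, 0.9, 0.3]       # mean engaged-concentration per student
ys = [85.0, 60.0, 92.0, 48.0]   # standardized exam scores

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# closed-form least squares for: score = a + b * engagement
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
a = mean_y - b * mean_x

predicted = a + b * 0.7  # predicted score for a new student
```

The paper's unified model combines several affect, disengagement, and performance features rather than one; this sketch only shows the basic fitting step such a model builds on.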
Building on this work could produce positive evidence for ‘Proposition A: learning analytics improve learning outcomes’ and ‘Proposition B: learning analytics improve learning support and teaching’.
This study provides strong indications that there are discrete online behaviour patterns that are representative of students' overall engagement and final assessment grade. The visualisation of online student engagement/effort affords instructors early opportunities to provide additional learning assistance and intervention when and where it is required. The capacity to establish early indicators of 'at-risk' students gives instructors timely opportunities to re-direct or add resources, facilitating progression towards patterns of behaviour that represent what has been termed a 'low-risk category' (e.g. participation in group discussions, regular access of discipline content and review of assessment criteria).
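A toy sketch of threshold-based 'at-risk' flagging in the spirit described above; the thresholds, activity fields, and student data are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: flagging potentially at-risk students from simple
# engagement counts. Thresholds and field names are invented.
def at_risk(stats, min_logins=5, min_posts=2):
    """Flag a student whose activity falls below both thresholds."""
    return stats["logins"] < min_logins and stats["forum_posts"] < min_posts

students = {
    "alice": {"logins": 12, "forum_posts": 6},
    "bob": {"logins": 2, "forum_posts": 0},
}

flagged = [name for name, s in students.items() if at_risk(s)]
print(flagged)  # ['bob']
```

Real early-warning systems would combine many more signals (and typically a trained model rather than fixed thresholds), but the instructor-facing idea is the same: surface low-engagement students early enough to intervene.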
The paper identifies opportunities for instructors to make these changes, but acting on them would be a next step. It therefore does not provide evidence that learning analytics improve learning support and teaching, only that they offer the potential to achieve this.
Citation: Dawson, Shane, McWilliam, Erica, & Tan, Jen Pei-Ling. (2008). Teaching smarter: How mining ICT data can inform and improve learning and teaching practice. Paper presented at ascilite 2008, Melbourne.
This paper describes the results of research work performed by the Open Academic Analytics Initiative, an ongoing research project aimed at developing an early-detection system for college students at academic risk, using data-mining models trained on student personal and demographic data as well as course-management data. It reports initial findings on the predictive performance of those models, their portability across pilot programmes at different institutions, and the results of interventions applied in those pilots.
Comment from Martyn Cooper: This paper looks at the integration of demographic information and analytics of online activity as a means of predicting students at risk. Importantly, it looks across different programmes in several institutions. The authors claim that predictive models are more portable than may have been initially assumed. However, again, the study is an initial investigation and only looked at data over one semester. They highlight that some students improve after "treatment" and others do not, and that this needs further investigation.
Citation: E. J. M. Lauria, E. W. Moody, S. M. Jayaprakash, N. Jonnalagadda, J. D. Baron (2013). Open academic analytics initiative: initial research findings, In: Third International Learning Analytics and Knowledge Conference (LAK13), 8-12 April, Leuven, Belgium. | Url: http://dl.acm.org/citation.cfm?id=2460296
This paper investigates the correspondence between student affect in a web-based tutoring platform throughout the school year and learning outcomes at the end of the year, on a high-stakes mathematics exam. Affect detectors were used to estimate student affective states based on post-hoc analysis of tutor log data. For every student action in the tutor, the detectors provided an estimated probability that the student was in a state of boredom, engaged concentration, confusion, or frustration, and estimated the probability that they were exhibiting off-task or gaming behaviours. Boredom during problem solving was found to be negatively correlated with performance; however, boredom was positively correlated with performance when exhibited during scaffolded tutoring. A similar pattern was seen for confusion. Engaged concentration and, surprisingly, frustration were both associated with positive learning outcomes.
Comment from Martyn Cooper: This paper reports an investigation of the relationship between affective states of students as noted in online tutor logs and their performance in summative high-stakes exams. An interesting result was that students who were bored or confused while answering the main problems tended to do poorly on the test; however, boredom and confusion on scaffolding were associated with positive performance on the test. It is a good example of using analytics to assess students' attitudes while learning. However, it is just a study with a single cohort over one year in a single limited learning environment used in maths education. The intention is to work towards better support for tutors to intervene appropriately given the students' affective states and their contexts.
Citation: Z. A. Pardos, R. S. J. D. Baker, M. O. C. Z. San Pedro, S. M. Gowda (2013). Affective states and state tests: investigating how affect throughout the school year predicts end of year learning outcomes, In: Third International Learning Analytics and Knowledge Conference (LAK13), 8-12 April, Leuven, Belgium. | Url: http://dl.acm.org/citation.cfm?id=2460296