Systematic investigation of the collaborative problem-solving process in open-ended, hands-on physical computing design tasks requires a framework that highlights the main process features, stages and actions, which can then be used to provide 'meaningful' learning analytics data. This paper presents an analysis framework that can be used to identify crucial aspects of the collaborative problem-solving process in practice-based learning activities. We deployed a mixed-methods approach that allowed us to generate an analysis framework that is theoretically robust and generalizable. Additionally, the framework is grounded in data and hence applicable to real-life learning contexts. This paper presents how our framework was developed and how it can be used to analyse data. We argue for the value of effective analysis frameworks in the generation and presentation of learning analytics for practice-based learning activities.
'Teaching analytics' is the application of learning analytics techniques to understand teaching and learning processes and, eventually, to enable supportive interventions. However, in the case of (often half-improvised) teaching in face-to-face classrooms, such interventions first require an understanding of what the teacher actually did, as the starting point for teacher reflection and inquiry. Currently, such characterization of teacher enactment requires costly manual coding by researchers. This paper presents a case study exploring the potential of machine learning techniques to automatically extract teaching actions during classroom enactment from five data sources collected using wearable sensors (eye-tracking, EEG, accelerometer, audio and video). Our results highlight the feasibility of this approach, with high levels of accuracy in determining the social plane of interaction (90%, Kappa=0.8). Reliably detecting the concrete teaching activity (e.g., explanation vs. questioning) remains challenging (67%, Kappa=0.56), a fact that prompts further research on multimodal features and models for teaching activity extraction, as well as the collection of a larger multimodal dataset to improve the accuracy and generalizability of these methods.
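The agreement figures above (Kappa=0.8, Kappa=0.56) refer to Cohen's kappa, which corrects raw accuracy for chance agreement. As a minimal illustration (the labels and data below are invented, not the study's), kappa can be computed as:

```python
from collections import Counter

def cohens_kappa(y_true, y_pred):
    """Agreement between predicted and manually coded labels,
    corrected for the agreement expected by chance (Cohen's kappa)."""
    n = len(y_true)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    labels = set(y_true) | set(y_pred)
    # Chance agreement: product of each label's marginal proportions.
    expected = sum(true_counts[l] * pred_counts[l] for l in labels) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical social-plane labels: whole-class (C), group (G), individual (I)
coded     = ["C", "C", "G", "I", "G", "C", "I", "G", "C", "G"]
predicted = ["C", "C", "G", "I", "G", "C", "G", "G", "C", "C"]
print(round(cohens_kappa(coded, predicted), 2))  # → 0.68
```

Note how 80% raw accuracy in this toy example shrinks to kappa of 0.68 once chance agreement is discounted, which is why the paper reports both figures.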
The affordances of learning analytics (LA) are being increasingly harnessed to enhance 21st century (21C) pedagogy and learning. Relatively rare, however, are use cases and empirically based understandings of students' actual experiences with LA tools and environments aimed at fostering 21C literacies, especially in secondary schooling and Asian education contexts. This paper addresses this knowledge gap by 1) presenting a first-iteration design of a computer-supported collaborative critical reading and LA environment and its 16-week implementation in a Singapore high school; and 2) foregrounding students' quantitative and qualitative accounts of the benefits and problematics associated with this learning innovation. We focus the analytic lens on the LA dashboard components that provided visualizations of students' reading achievement, 21C learning dispositions, critical literacy competencies and social learning network positioning within the class. The paper aims to provide insights into the potentialities, paradoxes and pathways forward for designing LA that take into consideration the voices of learners as critical stakeholders.
Evidence of the month: February 2015
The field of learning analytics is working to define frameworks that can structure the legal and ethical issues that scholars and practitioners have to take into account when planning and applying analytics to their learning contexts. However, the main focus of this work to date has been higher education. This paper reflects on the need to extend these ethical frameworks to cover other approaches to learning analytics; specifically, small-scale classroom-oriented approaches designed to support teachers in their practice.
This reflection is based on three studies in which a teacher-led learning analytics approach was employed in higher education and primary school contexts. The paper describes the ethical issues that emerged in these learning scenarios, and discusses them with regard to: the overall learning analytics approach, the particular solution to learning analytics adopted, and the educational contexts where the analytics were applied.
This work is presented as a first step towards the wider objective of providing a more comprehensive and adapted ethical framework to learning analytics that is able to address the needs of different learning analytics approaches and educational contexts.
Evidence of the month: October 2015
Uruguay was the first country in the world to meet the target of one laptop per child. The programme that achieved this, Plan Ceibal, covers all public schools in Uruguay, and around 700,000 students and teachers now have their own device. The aim is to use this technology to support and develop learning and teaching.
Plan Ceibal also operates and integrates national-scale databases, populated with data from a range of management and educational activities. These include demographic data, performance data and system usage data.
This conference paper sets out some of the steps that will be required to implement a learning analytics strategy within Plan Ceibal. This has the potential to support and develop current educational technology and learning policies across the country.
The UK's Department for Education (DfE) has been using big data to suggest that becoming an academy (a type of school currently favoured by the government) improves pupils' results on the national standardised tests in English and Maths (SATs).
However, the UK Statistics Authority (UKSA) has cast doubts on this analysis. Differences in results gains may not be connected with academy status. Instead, the differences appear to be part of a general trend for schools in England and Wales. Overall, those with poor SATs results in 2012-13 have closed the gap on those that previously had better results.
Non-academies that started with the same – generally poor – test results as sponsored academies in 2013 actually registered faster improvements in 2014.
The UKSA said that the DfE should in future state that the data as presented could not be used – by politicians or by others – to imply a causal link between academy status and improvements in test results.
The paper deals with two key ideas that could help schools graduate more students on time. The first is to produce a ranked list that orders students according to their risk of not graduating on time. The second is to predict when they will go off track, to help schools plan the urgency of interventions. Both predictions help schools identify and prioritize students at risk and target interventions accordingly. The eventual goal of these efforts is to focus schools' limited resources on increasing graduation rates.
The results of this study have helped a school district systematically to adjust analytical methods as they continue to build a universal EWI (early-warning indicator) system. The district is also highly interested in the web-based dashboard application that was developed.
SABER-Education Management Information Systems (SABER-EMIS) aims to help countries improve data collection, data and system management, and data use in decision making, with the goal of strengthening different elements of the education system and, ultimately, improving learning for all children and youth.
SABER-EMIS assesses the effectiveness of an EMIS with the aim of informing policy dialogue and helping countries monitor overall progress related to education inputs, processes and outcomes. It does this by:
- administering a set of tools and gathering qualitative and quantitative data (validated by legal documents) in order to assess the soundness of an information system.
- classifying and analysing existing education management information systems based on four policy areas (below).
- producing country reports and other knowledge products with the intention of improving a country’s education system.
Four key policy areas are assessed: enabling environment, system soundness, quality data and utilisation for decision making.
This blog post notes that, in May 2015, the World Bank was concluding an analysis of over 800 policy documents related to the use of information and communication technologies (ICTs) in education from high-, middle- and low-income countries around the world, in order to gain insight into key themes of common interest to policymakers. Theme 6 relates to learning analytics, specifically to 'supporting the collection, processing, analysis and dissemination of education-related data to relevant stakeholders'. Initial policies tend to focus on the collection of basic enrolment data. As ICT use becomes more prevalent, more systematic and holistic views of data collection, processing, analysis and dissemination emerge.
See also the SABER (systems approach for better education results) work on education management information systems (EMIS) at http://saber.worldbank.org/index.cfm?indx=8&tb=2
This study assesses the potential of natural language processing to predict human ratings of essay quality. Past studies have demonstrated that indicators such as word frequency are significant predictors of human judgments of essay quality. This study extends earlier work by using a larger data sample and an expanded set of indicators. These indicators increased accuracy and, more importantly, offer a way of providing more meaningful feedback in the context of a writing tutoring system. Students can be taught strategies that help them generate more text (for example, strategies to facilitate free writing, planning, drafting and elaboration), and such strategies improve their essay quality. The higher-quality essays included more phrasal constructions: they were more likely to include examples and to make contrasts between ideas. In their concluding paragraphs, their authors were more likely to include concluding statements (e.g., 'in conclusion'), statements of fact (e.g., 'it is') and reasons (e.g., 'because'). The Writing Pal is an intelligent tutoring system that provides writing strategy training aligned with these findings. It teaches students strategies for drafting and improving body and conclusion paragraphs. For instance, students are taught how to identify and edit evidence in the body of the essay that is too speculative rather than fact-based and objective. Similarly, students are taught to write conclusions that summarise key arguments without presenting additional or new evidence.
The focus of this paper is on natural language processing, rather than on the gains of learners. However, it implies that using the findings of the study to inform an intelligent tutoring system has improved essay quality.
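The kind of lexical indicator the study describes — counting phrasal cues associated with stronger conclusions — can be sketched as follows. This is an illustrative toy, not the study's actual feature extraction, and the cue phrases beyond those named in the summary are assumptions:

```python
import re

# Illustrative sketch: count simple lexical cues the study associates with
# higher-quality conclusions -- concluding statements, statements of fact,
# and reasons. The cue lists are partly invented for illustration.
CUES = {
    "concluding": ["in conclusion", "to conclude", "in summary"],
    "fact": ["it is"],
    "reason": ["because", "therefore"],
}

def conclusion_cues(paragraph):
    """Return a count of each cue type in a (lower-cased) paragraph."""
    text = paragraph.lower()
    return {kind: sum(len(re.findall(r"\b" + re.escape(p) + r"\b", text))
                      for p in phrases)
            for kind, phrases in CUES.items()}

sample = ("In conclusion, it is clear that the policy succeeded "
          "because enrolment rose and, therefore, outcomes improved.")
print(conclusion_cues(sample))  # → {'concluding': 1, 'fact': 1, 'reason': 2}
```

Counts like these would be just one family of indicators alongside word frequency and the other measures the study combines to predict human ratings.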