Archives

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

We designed and studied an innovative semantic visual learning analytics tool for orchestrating today's programming classes. The visual analytics integrates sources of learning activities by their content semantics. It automatically processes paper-based exams by associating sets of concepts with the exam questions. Results indicated that the automatic concept extraction from exams was promising and could be a potential technological solution to a real-world issue. We also found that the indexing was especially effective for complex content, where it covered more comprehensive semantics. Subjective evaluation revealed that the dynamic concept indexing provided teachers with immediate feedback on producing more balanced exams.

Citation: Sharon Hsiao, Sesha Kumar Pandhalkudi Govindarajan and Yiling Lin (2016). "Semantic Visual Analytics for Today's Programming Courses". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
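The abstract above does not spell out the extraction pipeline, so as a rough illustration of the core idea, associating a set of concepts with each exam question, here is a minimal keyword-matching sketch; the concept names and keywords are hypothetical and far simpler than the paper's semantic indexing:

```python
# Hypothetical concept-to-keyword map; invented for illustration only.
CONCEPT_KEYWORDS = {
    "loops": {"for", "while", "iterate", "loop"},
    "recursion": {"recursive", "recursion", "base case"},
    "arrays": {"array", "index", "element"},
}

def index_question(question_text):
    """Return the set of concepts whose keywords appear in the question."""
    text = question_text.lower()
    return {concept for concept, keywords in CONCEPT_KEYWORDS.items()
            if any(kw in text for kw in keywords)}

if __name__ == "__main__":
    q = "Write a for loop that sums the elements of an array."
    print(index_question(q))  # -> {'loops', 'arrays'}
```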

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper explores the potential of analytics for improving accessibility of e-learning systems and for supporting disabled learners in their studies.

The work is set in the context of learning and academic analytics' focus on issues of retention. The definitions of disability and accessibility in e-learning are outlined, and the implications of these for how disabled students' needs may be modeled in learning analytics systems are briefly discussed.

A comparative analysis of completion rates between disabled and non-disabled students in a large data set covering 5 years of Open University modules is presented. The wide variation in comparative retention rates is noted and characterized. A key assertion of this paper is that learning analytics provide a set of tools for identifying and understanding such discrepancies, and that analytics can be used to focus interventions that will improve the retention of disabled students in particular. A comparison of this quantitative approach with that of qualitative end-of-module surveys is made. The paper also describes how an approach called Critical Learning Paths, currently being researched, may be used to identify accessibility deficits in module components that significantly impact the learning of disabled students.

An outline agenda for onward research currently being planned is given.

It is hoped that this paper will stimulate a wider interest in the potential benefits of learning analytics for higher education institutions as they try to assure the accessibility of their e-learning and to provide support for disabled students.

Citation: Martyn Cooper, Rebecca Ferguson and Annika Wolff (2016). "What Can Analytics Contribute to Accessibility in e-Learning Systems and to Disabled Students' Learning?". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
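The comparative analysis described above comes down to computing completion rates by module and disability status. A minimal sketch of that kind of comparison, using an invented table whose column names are assumptions rather than the actual schema of the Open University data:

```python
import pandas as pd

# Hypothetical student records; column names are illustrative assumptions.
records = pd.DataFrame({
    "module": ["M1", "M1", "M1", "M2", "M2", "M2"],
    "disabled": [True, False, False, True, False, True],
    "completed": [0, 1, 1, 1, 1, 0],
})

# Completion rate per module, split by declared disability.
rates = records.groupby(["module", "disabled"])["completed"].mean().unstack()

# Comparative retention: ratio of disabled to non-disabled completion rates.
rates["ratio"] = rates[True] / rates[False]
print(rates)
```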

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Data-driven methods have previously been used in intelligent tutoring systems to improve student learning outcomes and predict student learning methods. We have been incorporating data-driven methods for feedback and problem selection into Deep Thought, a logic tutor in which students practice constructing deductive logic proofs. These methods include data-driven hints and a data-driven mastery learning system (DDML), which calculates student proficiency from rule scores weighted according to expert input in order to assign problem sets of appropriate difficulty.
In this latest study, we implemented our data-driven proficiency profiler (DDPP) in Deep Thought as a proof of concept. The DDPP determines student proficiency without expert involvement by comparing relevant student rule scores to those of previous students who behaved similarly in the tutor and successfully completed it. The results show that the DDPP improved in performance with additional data and proved to be an effective proof of concept.

Citation: Behrooz Mostafavi and Tiffany Barnes (2016). "Data-driven Proficiency Profiling - Proof of Concept". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
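The abstract does not detail how the DDPP matches a student to "previous students who behaved similarly", so the following is only a generic similarity-based sketch of that idea, not the authors' algorithm; the function and variable names are hypothetical:

```python
import numpy as np

def estimate_proficiency(current_scores, past_scores, past_proficiency, k=5):
    """Estimate proficiency from the k most similar successful past students.

    current_scores:   (n_rules,) rule scores of the current student.
    past_scores:      (n_students, n_rules) rule scores of students who
                      successfully completed the tutor.
    past_proficiency: (n_students,) proficiency values for those students.
    """
    distances = np.linalg.norm(past_scores - current_scores, axis=1)
    nearest = np.argsort(distances)[:k]
    return float(past_proficiency[nearest].mean())

# Toy usage with invented rule scores.
past = np.array([[0.9, 0.8, 0.7], [0.2, 0.3, 0.4], [0.85, 0.75, 0.8]])
prof = np.array([0.9, 0.3, 0.8])
print(estimate_proficiency(np.array([0.88, 0.79, 0.75]), past, prof, k=2))
```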

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Student intention and motivation are among the strongest predictors of persistence and completion in Massive Open Online Courses (MOOCs), but these factors are typically measured through fixed-response items that constrain student expression. We use natural language processing techniques to evaluate whether text analysis of open-response questions about motivation and utility value can offer additional capacity to predict persistence and completion over and above information obtained from fixed-response items. Compared to simple benchmarks based on demographics and dictionary-based language analyses, we find that a machine learning prediction model can learn from unstructured text to predict which students will complete an online course. We show that the model performs well out-of-sample within a single course, and out-of-context in a related course, though not out-of-subject in an unrelated course. These results demonstrate the potential for natural language processing to contribute to predicting student success in MOOCs and other forms of open online learning.

Citation: Carly Robinson, Michael Yeomans, Justin Reich, Chris Hulleman and Hunter Gehlbach (2016). "Forecasting Student Achievement in MOOCs with Natural Language Processing". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
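As a rough illustration of the general approach, predicting completion from unstructured open-response text, here is a minimal bag-of-words sketch using TF-IDF features and logistic regression; the responses and labels are invented, and the paper's actual feature set and models may differ:

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical open-response answers and completion labels (1 = completed).
responses = [
    "I want to learn statistics to advance my career.",
    "Just browsing, not sure I will have time this term.",
    "This course is required for my degree and I am committed.",
    "Curious about the topic, may drop if it gets hard.",
]
completed = [1, 0, 1, 0]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(responses, completed)

print(model.predict(["I am determined to finish because I need this for work."]))
```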

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Systematic investigation of the collaborative problem solving process in open-ended, hands-on, physical computing design tasks requires a framework that highlights the main process features, stages and actions that can then be used to provide 'meaningful' learning analytics data. This paper presents an analysis framework that can be used to identify crucial aspects of the collaborative problem solving process in practice-based learning activities. We deployed a mixed-methods approach that allowed us to generate an analysis framework that is theoretically robust and generalizable. Additionally, the framework is grounded in data and hence applicable to real-life learning contexts. This paper presents how our framework was developed and how it can be used to analyse data. We argue for the value of effective analysis frameworks in the generation and presentation of learning analytics for practice-based learning activities.

Citation: Mutlu Cukurova, Katerina Avramides, Daniel Spikol, Rose Luckin and Manolis Mavrikis (2016). "An Analysis Framework for Collaborative Problem Solving in Practice-based Learning Activities: A Mixed-method Approach". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

As the field of learning analytics continues to mature, there is a corresponding evolution and sophistication of the associated analytical methods and techniques. In this regard, social network analysis (SNA) has emerged as one of the cornerstones of learning analytics methodologies. However, despite the noted importance of social networks for facilitating the learning process, it remains unclear how and to what extent such network measures are associated with specific learning outcomes. Motivated by Simmel's theory of social interactions and building on the argument that social centrality does not always imply benefits, this study aimed to further contribute to the understanding of the association between students' social centrality and their academic performance. The study reveals that learning analytics research drawing on SNA should incorporate both descriptive and statistical methods to provide a more comprehensive and holistic understanding of a student's network position. In so doing, researchers can undertake more nuanced and contextually salient inferences about learning in network settings. Specifically, we show how differences in the factors framing students' interactions within two instances of a MOOC affect the association between three social network centrality measures (i.e., degree, closeness, and betweenness) and the final course outcome.

Citation: Srećko Joksimović, Areti Manataki, Dragan Gašević, Shane Dawson, Vitomir Kovanović and Inés Friss de Kereki (2016). "Translating network position into performance: Importance of Centrality in Different Network Configurations". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
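The three centrality measures named in the abstract are standard SNA quantities. A small sketch of computing them on a toy interaction graph (illustrative data only, not the MOOC networks from the study):

```python
import networkx as nx

# Toy forum-reply network; nodes are students, edges are interactions.
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("carol", "dave"), ("dave", "erin"),
])

centralities = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
}

for measure, values in centralities.items():
    print(measure, {node: round(v, 2) for node, v in values.items()})
```

In a study like the one above, such measures would then be related to final course outcomes with statistical models, for example by regressing performance on centrality within each course instance.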

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

'Teaching analytics' is the application of learning analytics techniques to understand teaching and learning processes, and eventually enable supportive interventions. However, in the case of (often, half-improvised) teaching in face-to-face classrooms, such interventions would first require an understanding of what the teacher actually did, as the starting point for teacher reflection and inquiry. Currently, such characterization of teacher enactment requires costly manual coding by researchers. This paper presents a case study exploring the potential of machine learning techniques to automatically extract teaching actions during classroom enactment, from five data sources collected using wearable sensors (eye-tracking, EEG, accelerometer, audio and video). Our results highlight the feasibility of this approach, with high levels of accuracy in determining the social plane of interaction (90%, Kappa=0.8). Reliably detecting the concrete teaching activity (e.g., explanation vs. questioning) remains challenging (67%, Kappa=0.56), a fact that will prompt further research on multimodal features and models for teaching activity extraction, as well as the collection of a larger multimodal dataset to improve the accuracy and generalizability of these methods.

Citation: Luis Pablo Prieto, Kshitij Sharma, María Jesús Rodríguez-Triana and Pierre Dillenbourg (2016). "Teaching Analytics: Towards Automatic Extraction of Orchestration Graphs Using Wearable Sensors". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
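The paper's features and models are not reproduced here; the sketch below only illustrates the kind of supervised-classification-plus-Cohen's-Kappa evaluation the abstract reports, on synthetic multimodal feature windows with hypothetical social-plane labels:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for multimodal feature windows (e.g., eye-tracking, EEG,
# motion, audio summaries); labels: 0=individual, 1=group, 2=whole class.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = rng.integers(0, 3, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
pred = clf.predict(X_test)

# On random data these scores will hover near chance; the point is the
# evaluation setup (accuracy plus Kappa), not the numbers.
print("accuracy:", accuracy_score(y_test, pred))
print("kappa:", cohen_kappa_score(y_test, pred))
```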

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

The majority of learning analytics research focuses on the prediction of course performance and the modeling of student behaviors, with an emphasis on identifying students who are at risk of failing the course. Learning analytics should have a stronger focus on improving the quality of learning for all students, not only on identifying at-risk students. In order to do so, we need to understand what successful patterns look like when reflected in data and subsequently adjust the course design to avoid unsuccessful patterns and facilitate successful ones.

However, when establishing these successful patterns, it is important to account for individual differences among students, since previous research has shown that not all students engage with learning resources to the same extent. Regulation strategies seem to play an important role in explaining the different usage patterns students display when using digital learning resources. When learning analytics research incorporates contextualized data about student regulation strategies, we are able to differentiate between students at a more granular level.
The current study examined whether regulation strategies could account for differences in the use of various learning resources. It examined how students regulated their learning process and subsequently used the different learning resources throughout the course, and established how this use contributed to course performance.

The results show that students with different regulation strategies use the learning resources to the same extent. However, the use of learning resources influences course performance differently for different groups of students. This paper recognizes the importance of contextualizing learning data with a broader set of indicators in order to understand the learning process. With our focus on differences between students, we strive for a shift within learning analytics from identifying at-risk students towards contributing to the educational design process and enhancing the quality of learning for all students.

Citation: Nynke Bos and Saskia Brand-Gruwel (2016). "Student differences in regulation strategies and their use of learning resources: implications for educational design". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

The proliferation of varied types of multi-user interactive surfaces (such as digital whiteboards, tabletops and tangible interfaces) is opening a new range of applications in face-to-face (f2f) contexts. They offer unique opportunities for Learning Analytics (LA) by facilitating multi-user sensemaking of automatically captured digital footprints of students' f2f interactions. This paper presents an analysis of current research exploring learning analytics associated with the use of surface devices. We use a framework to analyse our first-hand experiences, and the small number of related deployments according to four dimensions: the orchestration aspects involved; the phases of the pedagogical practice that are supported; the target actors; and the levels of iteration of the LA process. The contribution of the paper is two-fold: 1) a synthesis of conclusions that identify the degree of maturity, challenges and pedagogical opportunities of the existing applications of learning analytics and interactive surfaces; and 2) an analysis framework that can be used to characterise the design space of similar areas and LA applications.

Citation: Roberto Martinez-Maldonado, Bertrand Schneider, Sven Charleer, Simon Buckingham Shum, Joris Klerkx and Erik Duval (2016). "Interactive Surfaces and Learning Analytics: Data, Orchestration Aspects, Pedagogical Uses and Challenges". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

The affordances of learning analytics (LA) are being increasingly harnessed to enhance 21st century (21C) pedagogy and learning. Relatively rare, however, are use cases and empirically based understandings of students' actual experiences with LA tools and environments aimed at fostering 21C literacies, especially in secondary schooling and Asian education contexts. This paper addresses this knowledge gap by 1) presenting a first-iteration design of a computer-supported collaborative critical reading and LA environment and its 16-week implementation in a Singapore high school; and 2) foregrounding students' quantitative and qualitative accounts of the benefits and problematics associated with this learning innovation. We focus the analytic lens on the LA dashboard components that provided visualizations of students' reading achievement, 21C learning dispositions, critical literacy competencies and social learning network positioning within the class. The paper aims to provide insights into the potentialities, paradoxes and pathways forward for designing LA that take into consideration the voices of learners as critical stakeholders.

Citation: Jennifer Pei-Ling Tan, Simon Yang, Elizabeth Koh and Christin Jonathan (2016). "Fostering 21st century literacies through a collaborative critical reading and learning analytics environment: User-perceived benefits and problematics". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.