In the last few years, there has been growing interest in learning analytics (LA) in technology-enhanced learning (TEL). Generally, LA deals with the development of methods that harness educational data sets to support the learning process. Recently, the concept of open learning analytics (OLA) has received a great deal of attention from the LA community, due to the growing demand for self-organized, networked, and lifelong learning opportunities. A key challenge in OLA is to follow a personalized and goal-oriented LA model that tailors the LA task to the needs and goals of multiple stakeholders. Current implementations of LA rely on a predefined set of questions and indicators. There is, however, a need to adopt a personalized LA approach that engages end users in the indicator definition process by supporting them in setting goals, posing questions, and self-defining the indicators that help them achieve their goals. In this paper, we address the challenge of personalized LA and present the conceptual, design, and implementation details of a rule-based indicator definition tool that supports the flexible definition and dynamic generation of indicators to meet the needs of different stakeholders with diverse goals and questions in the LA exercise.
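The rule-based indicator idea described above can be illustrated with a small sketch: an indicator pairs a stakeholder's question with a filter rule and a metric over logged learning events. All names and structures below are hypothetical illustrations, not the tool's actual API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Indicator:
    """Hypothetical indicator: a question plus a rule and a metric."""
    question: str
    rule: Callable[[dict], bool]       # which events count toward the indicator
    metric: Callable[[list], float]    # how the counted events are summarized

    def evaluate(self, events):
        # Apply the user-defined rule, then summarize the matching events.
        return self.metric([e for e in events if self.rule(e)])

# Illustrative event log (fields are invented for this sketch):
events = [
    {"user": "u1", "action": "post", "week": 1},
    {"user": "u2", "action": "view", "week": 1},
    {"user": "u1", "action": "post", "week": 2},
]

# A stakeholder self-defines an indicator by choosing rule and metric:
forum_activity = Indicator(
    question="How many forum posts were made?",
    rule=lambda e: e["action"] == "post",
    metric=len,
)
assert forum_activity.evaluate(events) == 2
```

The point of the sketch is that different stakeholders can generate different indicators from the same event stream simply by supplying different rule/metric pairs, rather than relying on a predefined set.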
In recent years, Massive Open Online Courses (MOOCs) have become a phenomenon, offering the possibility to teach thousands of participants simultaneously. At the same time, the platforms used to deliver these courses are still in their fledgling stages. While course content and didactics are the primary factors for the success of these massive courses, a smart platform can still improve or degrade the learner's experience and learning outcomes. This paper proposes an A/B testing framework that can be used within a microservice architecture to validate hypotheses about how learners use the platform and to enable data-driven decisions about new features and settings. To evaluate this framework, three new features (Onboarding Tour, Reminder Mails, and a Pinboard Digest) were identified based on a user survey. They were implemented and introduced on two large MOOC platforms, and their influence on learner behavior was measured. Finally, the paper proposes a data-driven decision workflow for the introduction of new features and settings on e-learning platforms.
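The core mechanism of such an A/B testing setup can be sketched in a few lines: learners are deterministically assigned to a variant by hashing their identifier together with the experiment name, so the same learner always sees the same variant across sessions and services. This is a generic illustration under assumed names, not the framework described in the paper.

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically assign a user to an experiment variant.

    Hashing the (experiment, user) pair gives a stable bucket, so a
    learner keeps the same variant across sessions and microservices
    without any shared state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The assignment is stable: repeated calls return the same variant.
v1 = assign_variant("learner-42", "onboarding_tour")
v2 = assign_variant("learner-42", "onboarding_tour")
assert v1 == v2 and v1 in ("control", "treatment")
```

Because assignment is a pure function of the identifiers, each microservice can compute it independently, which is one reason hash-based bucketing is a common choice in distributed experimentation setups.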
This article concerns the evaluation of the social capital of European teachers participating in the eTwinning Portal activities, run by European Schoolnet. As of 2010, the portal counted more than 160,000 registered teachers from 35 countries, involved in more than 19,000 projects. This evaluation has been performed using the Social Network Analysis (SNA) approach.
The authors report correlations "between social network analysis measures like degree and betweenness centrality as well as the local clustering coefficient, activity statistics about usage of eTwinning and the quality management of European Schoolnet".
For the analysis of eTwinning network data, three Learning Analytics tools have been developed:
- eVa (eTwinning Network Visualization and Analysis), a network visualization and simple analysis tool.
- CAfe (Competence Analyst for eTwinning), an SNA-based competence management and teachers' self-monitoring tool.
- AHTC (Ad Hoc Transient Communities) services, which involve users into question-answer activities on the eTwinning Portal.
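Two of the SNA measures mentioned above, degree and the local clustering coefficient, can be computed directly on a collaboration graph. The sketch below uses a toy graph with invented teacher nodes, purely to illustrate the measures, not the eTwinning data or tools.

```python
from itertools import combinations

# Toy collaboration graph: nodes are teachers, edges are joint projects.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Degree: number of direct collaborators of each teacher.
degree = {n: len(nbrs) for n, nbrs in adj.items()}

def local_clustering(node):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = adj[node]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for x, y in combinations(nbrs, 2) if y in adj[x])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

# "a" sits in a closed triangle (a, b, c), so its clustering is 1.0,
# while bridge node "c" has the highest degree but lower clustering.
assert degree["c"] == 3
assert local_clustering("a") == 1.0
```

In an SNA-based tool like those listed above, such measures would be computed per teacher and tracked over time; high betweenness with low clustering, for instance, typically marks brokers between otherwise separate project communities.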
This study used main path analysis on Wikiversity data to identify core topics in the areas of biology and electrical engineering. The results could be useful for the Wikiversity community as a whole. They point to the need for better coordination of the disparate topics in a domain. They could be used to orient participants by showing them the importance of the topic they are working on, and by revealing important reference points to other core topics in the field. Seeing the main paths could prompt a new contributor to add to an existing strand of knowledge development or to start a new peripheral one. Advanced participants in the community might benefit from the analysis as a historical reconstruction of the shared knowledge-building process, comparing their own visions and goals with the actual knowledge development of the community and identifying topical gaps. The paper presents the analysis steps in detail, including a description of a tool environment that is designed for flexible use by non-computer experts.
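Main path analysis typically weights each link in a directed acyclic graph by its search path count (SPC): the number of source-to-sink paths that traverse it. The sketch below computes SPC weights on a toy DAG with invented nodes; it illustrates the standard weighting scheme, not the study's actual tool environment or data.

```python
from functools import lru_cache

# Toy knowledge-development DAG: an edge (u, v) means v builds on u.
edges = [("s", "a"), ("s", "b"), ("a", "c"), ("b", "c"), ("c", "t")]
succ, pred = {}, {}
for u, v in edges:
    succ.setdefault(u, []).append(v)
    pred.setdefault(v, []).append(u)

@lru_cache(None)
def n_minus(n):
    """Number of paths from any source down to node n."""
    return 1 if n not in pred else sum(n_minus(p) for p in pred[n])

@lru_cache(None)
def n_plus(n):
    """Number of paths from node n down to any sink."""
    return 1 if n not in succ else sum(n_plus(s) for s in succ[n])

# SPC weight of edge (u, v) = paths reaching u * paths leaving v.
spc = {(u, v): n_minus(u) * n_plus(v) for u, v in edges}

# Edge (c, t) lies on every source-to-sink path, so it carries the
# highest SPC weight; the main path is traced along such edges.
assert spc[("c", "t")] == max(spc.values())
```

Starting from a source and repeatedly following the highest-SPC outgoing edge then yields the main path, i.e., the backbone strand of knowledge development that a reader of the analysis would be oriented along.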
Wearable devices and ambient sensors can monitor a growing number of aspects of daily life and work. Learning by reflection is essential in today's dynamic work environments, as employees have to adapt their behavior according to their experiences. Two case studies, one with care staff in a care home and one with voluntary crisis workers, show that tools which turn context information collected through sensors into learning content can support reflective learning at work.
This paper seeks to unite learning analytics and action research. It investigates how the multitude of questions that arise during technology-enhanced teaching and learning can be systematically mapped to sets of indicators. It identifies questions that are not yet supported and proposes indicators that offer a high potential for positively influencing teachers' didactical considerations. The investigation shows that many questions from teachers cannot be answered using the research tools that are currently available. Furthermore, few learning analytics studies measure impact. The paper identifies effects that learning analytics could have on teaching and discusses how this could be evaluated.
Comment from Martyn Cooper: This is a meta-study that looks at a wide range of learning analytics tools. The paper highlights that the impact of learning analytics tools has not been evaluated, and it suggests how Action Research could contribute to addressing this. At this stage, including this paper in an evidence hub would be useful precisely because it highlights the lack of evidence.