This paper explores a longitudinal approach to combining engagement, performance, and social connectivity data from a MOOC using the framework of exponential random graph models (ERGMs). The idea is to model the social network in the discussion forum in a given week not only on performance (assignment scores) and overall engagement (lecture and discussion views) covariates within that week, but also on the same person-level covariates from adjacent previous and subsequent weeks. We find that over all eight weekly sessions, the social networks constructed from the forum interactions are relatively sparse and lack the tendency for preferential attachment. Analyzing data from the second week, we also find that individuals with higher performance scores from the current, previous, and future weeks tend to be more connected in the social network. Engagement with lectures had significant but sometimes puzzling effects on social connectivity. However, the relationships between social connectivity, performance, and engagement weakened over time, and results were not stable across weeks.
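As a rough illustration of the descriptive checks that motivate such findings, the sketch below computes density and the degree distribution for a toy forum reply network. This is not a fitted ERGM (which would add structural and covariate terms); the edge list and node count are invented for illustration.

```python
# Sketch: descriptive network checks -- not a fitted ERGM.
# The reply edges and learner count below are invented.
from collections import Counter

n_nodes = 10                                      # learners active in the week
edges = [(1, 2), (1, 3), (2, 3), (4, 5), (5, 6)]  # undirected forum replies

# Sparsity: observed ties over possible ties.
density = len(edges) / (n_nodes * (n_nodes - 1) / 2)

# Degree distribution; a heavy right tail would suggest preferential
# attachment, which the paper reports is absent in these forums.
degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
max_degree = max(degree.values())

print(f"density={density:.3f}, max degree={max_degree}")
```

A low density and a flat degree distribution are consistent with the sparse, non-preferential networks the abstract describes.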
With the aim of better scaffolding discussion to improve learning in a MOOC context, this work investigates what kinds of discussion behaviors contribute to learning. We explored whether engaging in higher-order thinking behaviors results in more learning than paying general or focused attention to course materials. To evaluate whether the effect should be attributed to engagement in the associated behaviors or to persistent characteristics of the students, we adopted two approaches. First, we used propensity score matching to pair students who exhibit a similar level of involvement in other course activities. Second, we explored individual variation in engagement in higher-order thinking behaviors across weeks. The results of both analyses support the attribution of the effect to the behavioral interpretation. A further analysis, applying LDA to the course materials, suggests that more socially oriented topics triggered richer discussion than more biopsychology-oriented topics.
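The matching step can be sketched as follows. For brevity this toy version matches directly on a single involvement score rather than a fitted propensity model over many covariates, and all student records are invented.

```python
# Minimal matching sketch: pair each student who exhibited higher-order
# thinking with the nearest non-exhibiting student on an involvement score.
# A real analysis would fit a propensity model; all data here are invented.

students = [
    # (id, involvement score, higher_order, learning gain)
    ("a", 0.90, True, 0.8),
    ("b", 0.50, True, 0.6),
    ("c", 0.85, False, 0.7),
    ("d", 0.55, False, 0.4),
    ("e", 0.20, False, 0.1),
]

treated = [s for s in students if s[2]]
controls = [s for s in students if not s[2]]

pairs = []
available = list(controls)
for t in treated:
    # Greedy nearest-neighbour match without replacement.
    match = min(available, key=lambda c: abs(c[1] - t[1]))
    available.remove(match)
    pairs.append((t, match))

# Average treated-minus-control difference in learning gain over pairs.
att = sum(t[3] - c[3] for t, c in pairs) / len(pairs)
print(f"matched pairs={len(pairs)}, estimated effect={att:.2f}")
```

Matching on involvement holds the "persistent characteristics" explanation roughly constant, so any remaining difference is more plausibly behavioral.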
Computerized educational systems are increasingly provided as open online services that offer adaptive, personalized learning experiences. To fully exploit the potential of such systems, it is necessary to thoroughly evaluate different design choices. However, both openness and adaptivity make proper evaluation difficult. We provide a detailed report on the evaluation of an online system for adaptive practice of geography, and use this case study to highlight methodological issues with the evaluation of open online learning systems, particularly attrition bias. To facilitate the evaluation of learning, we propose the use of randomized reference questions. We illustrate the application of survival analysis and learning curves to declarative knowledge. The results provide interesting insight into the impact of adaptivity on learner behaviour and learning.
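Survival analysis of learner attrition can be illustrated with a minimal Kaplan-Meier estimator, where the "event" is a learner leaving the system and learners still active at the end of observation are censored. The observations below are invented.

```python
# Minimal Kaplan-Meier sketch for learner attrition.
# Each record is (weeks observed, event), where event = 1 means the learner
# dropped out at that week and event = 0 means censored (still active).
observations = [(1, 1), (2, 1), (2, 0), (3, 1), (5, 0), (5, 0)]

at_risk = len(observations)
survival = 1.0
curve = {}
for t in sorted({t for t, _ in observations}):
    dropouts = sum(1 for w, e in observations if w == t and e == 1)
    if dropouts:
        # Kaplan-Meier product-limit update at each event time.
        survival *= 1 - dropouts / at_risk
    curve[t] = survival
    # Everyone observed up to week t (dropped or censored) leaves the risk set.
    at_risk -= sum(1 for w, _ in observations if w == t)

print(curve)  # estimated probability of still being active after each week
```

Handling censoring this way is what lets an open system compare retention across design variants without pretending that every silent learner has quit.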
This study used main path analysis on Wikiversity data to identify core topics in the areas of biology and electrical engineering. The results could be useful for the Wikiversity community as a whole. They point to the need for better coordination of the disparate topics in a domain. They could be used to orient participants by showing them the importance of the topic they are working on, and by revealing important reference points to other core topics in the field. Seeing the main paths could prompt a new contributor to add to an existing strand of knowledge development or to start a new peripheral one. Advanced participants in the community might benefit from the analysis as an historical reconstruction of the shared knowledge-building process, comparing their own visions and goals with the actual knowledge development of the community and identifying topical gaps. The paper presents analysis steps in detail, including a description of a tool environment that is designed for flexible use by non‐computer experts.
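Main path analysis typically begins from traversal weights such as search path counts (SPC): each edge in an acyclic citation or revision graph is weighted by the number of source-to-sink paths passing through it, and the main path follows the highest-weight edges. The toy DAG below is invented and merely stands in for links between Wikiversity topics.

```python
# Sketch of search-path-count (SPC) weights for main path analysis.
# The toy DAG is invented; nodes stand in for Wikiversity topics/revisions.
from collections import defaultdict

edges = [("s", "a"), ("s", "b"), ("a", "c"), ("b", "c"), ("c", "t")]
succ, pred = defaultdict(list), defaultdict(list)
nodes = set()
for u, v in edges:
    succ[u].append(v)
    pred[v].append(u)
    nodes |= {u, v}

# Topological order via repeated removal of zero-in-degree nodes.
order = []
indeg = {n: len(pred[n]) for n in nodes}
ready = [n for n in nodes if indeg[n] == 0]
while ready:
    n = ready.pop()
    order.append(n)
    for m in succ[n]:
        indeg[m] -= 1
        if indeg[m] == 0:
            ready.append(m)

# Number of paths from any source to n, and from n to any sink.
from_src, to_sink = {}, {}
for n in order:
    from_src[n] = sum(from_src[p] for p in pred[n]) if pred[n] else 1
for n in reversed(order):
    to_sink[n] = sum(to_sink[s] for s in succ[n]) if succ[n] else 1

# SPC of an edge (u, v) = paths into u times paths out of v.
spc = {(u, v): from_src[u] * to_sink[v] for u, v in edges}
print(spc)  # the main path greedily follows the highest-SPC edges
```

Edges that many knowledge-development paths must pass through get high SPC, which is what makes them candidates for the "core topics" the study identifies.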
Editorial for the first issue of the Journal of Learning Analytics (JLA), a peer-reviewed, open-access journal published by the Society for Learning Analytics Research (SoLAR). This is the first journal dedicated to research investigating the challenges of collecting, analyzing, and reporting data with the specific intent of understanding and improving learning. The journal provides a space for promoting both research and practice. It represents and reflects the diversity of learning analytics work in formal and informal education contexts, spanning K–12, vocational and higher education, and workplace settings.
Evidence of the Month: April 2015
This paper reports findings from a study of MOOC log file data relating to a large University of Melbourne MOOC that ran in 2013. The study investigated a series of hypotheses: (i) there is skill involved in using forums to learn; (ii) MOOC participants use forums differently as they progress from novice to expert in this skill; (iii) this progression is reflected in log file data; and (iv) log file data can be used to measure a learner's skill in learning through forums. The study provides provisional support for each of these hypotheses.
The paper also sets out how learners' skill in using forums can develop. The proposed framework identifies how learners at each level are likely to view knowledge, how they are likely to view forums in the context of learning, and how they are likely to use MOOC forums. The framework has five levels:
- Level 1: Novice, dependent learner
- Level 2: Beginning, independent learner
- Level 3: Proficient, collegial learner
- Level 4: Competent, collaborative learner
- Level 5: Expert, learning leader
For example, at Level 1: 'Learning is about consuming stable knowledge in a domain, comprised mainly of cognitive understanding or skill; seeks efficient transfer from authoritative or reputable sources; follows procedural guidance from teachers, relinquishing responsibility for learning process; calibrates performance on formal assessments and accepts standards inherent in them. Forums are essentially social adjuncts to courses; the information is possibly unreliable and misleading. Never visits forums.'
The relationship between achievement on the course studied and level on this scale was apparently strong. Of those rated at Level 5 (expert), 78% received at least a pass score and 67% a distinction score. However, of those rated at Level 1, fewer than 1% received a pass score.