Archives

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Student intention and motivation are among the strongest predictors of persistence and completion in Massive Open Online Courses (MOOCs), but these factors are typically measured through fixed-response items that constrain student expression. We use natural language processing techniques to evaluate whether text analysis of open-response questions about motivation and utility value can offer additional capacity to predict persistence and completion over and above information obtained from fixed-response items. Compared to simple benchmarks based on demographics and dictionary-based language analyses, we find that a machine learning prediction model can learn from unstructured text to predict which students will complete an online course. We show that the model performs well out-of-sample within a single course, and out-of-context in a related course, though not out-of-subject in an unrelated course. These results demonstrate the potential for natural language processing to contribute to predicting student success in MOOCs and other forms of open online learning.
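
As an illustration of the general approach, a minimal sketch is shown below: predicting completion from open-response text using TF-IDF features and logistic regression. These are illustrative choices, not the authors' actual model, and the survey responses and labels are hypothetical.

```python
# Hedged sketch: predicting course completion from open-response survey text.
# TF-IDF + logistic regression are illustrative assumptions, not the paper's
# model; the responses and labels below are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

responses = [
    "I need this certificate to advance in my current job",
    "My employer asked me to upskill in data analysis",
    "This course is directly relevant to my degree",
    "I plan to apply every module to my startup",
    "Just browsing out of curiosity",
    "A friend shared the link so I signed up",
    "Not sure yet what I want from this",
    "Enrolled on a whim during the holidays",
]
completed = [1, 1, 1, 1, 0, 0, 0, 0]  # 1 = completed the course

X_train, X_test, y_train, y_test = train_test_split(
    responses, completed, test_size=0.25, stratify=completed, random_state=0)

vectorizer = TfidfVectorizer()
model = LogisticRegression(max_iter=1000)
model.fit(vectorizer.fit_transform(X_train), y_train)

probs = model.predict_proba(vectorizer.transform(X_test))[:, 1]
print("held-out AUC:", roc_auc_score(y_test, probs))
```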

Citation: Carly Robinson, Michael Yeomans, Justin Reich, Chris Hulleman and Hunter Gehlbach (2016). "Forecasting Student Achievement in MOOCs with Natural Language Processing". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

As the field of learning analytics continues to mature, there is a corresponding evolution and sophistication of the associated analytical methods and techniques. In this regard social network analysis (SNA) has emerged as one of the cornerstones of learning analytics methodologies. However, despite the noted importance of social networks for facilitating the learning process, it remains unclear how and to what extent such network measures are associated with specific learning outcomes. Motivated by Simmel's theory of social interactions and building on the argument that social centrality does not always imply benefits, this study aimed to further contribute to the understanding of the association between students' social centrality and their academic performance. The study reveals that learning analytics research drawing on SNA should incorporate both descriptive and statistical methods to provide a more comprehensive and holistic understanding of a student's network position. In so doing, researchers can undertake more nuanced and contextually salient inferences about learning in network settings. Specifically, we show how differences in the factors framing students' interactions within two instances of a MOOC affect the association between three social network centrality measures (i.e., degree, closeness, and betweenness) and the final course outcome.
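
For readers unfamiliar with the three measures named in the abstract, a brief sketch of how they are computed from an interaction graph, here using networkx on a hypothetical forum reply network rather than the study's data:

```python
# Sketch: the three centrality measures discussed in the paper, computed on a
# hypothetical discussion-forum reply graph (not the study's data).
import networkx as nx

# Each edge links a student to another student they replied to.
G = nx.Graph([("ana", "ben"), ("ben", "carol"), ("carol", "ana"),
              ("dina", "ben"), ("eli", "dina")])

measures = {
    "degree": nx.degree_centrality(G),           # share of possible direct ties
    "closeness": nx.closeness_centrality(G),     # inverse average distance to others
    "betweenness": nx.betweenness_centrality(G), # share of shortest paths through a node
}
for name, scores in measures.items():
    top = max(scores, key=scores.get)
    print(f"{name:12s} most central: {top} ({scores[top]:.2f})")
```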

Citation: Srećko Joksimović, Areti Manataki, Dragan Gašević, Shane Dawson, Vitomir Kovanović and Inés Friss de Kereki (2016). "Translating network position into performance: Importance of Centrality in Different Network Configurations". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

The majority of learning analytics research focuses on predicting course performance and modeling student behaviors, with an emphasis on identifying students who are at risk of failing the course. Learning analytics should have a stronger focus on improving the quality of learning for all students, not only on identifying at-risk students. In order to do so, we need to understand what successful patterns look like when reflected in data, and subsequently adjust the course design to avoid unsuccessful patterns and facilitate successful ones.

However, when establishing these successful patterns, it is important to account for individual differences among students, since previous research has shown that not all students engage with learning resources to the same extent. Regulation strategies seem to play an important role in explaining the different usage patterns students display when using digital learning resources. When learning analytics research incorporates contextualized data about student regulation strategies, we are able to differentiate between students at a more granular level.
The current study examined whether regulation strategies could account for differences in the use of various learning resources. It examined how students regulated their learning process and subsequently used the different learning resources throughout the course, and established how this use contributes to course performance.

The results show that students with different regulation strategies use the learning resources to the same extent. However, the use of learning resources influences course performance differently for different groups of students. This paper recognizes the importance of contextualizing learning data with a broader set of indicators to understand the learning process. With our focus on differences between students, we strive for a shift within learning analytics from identifying at-risk students towards contributing to the educational design process and enhancing the quality of learning for all students.

Citation: Nynke Bos and Saskia Brand-Gruwel (2016). "Student differences in regulation strategies and their use of learning resources: implications for educational design". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

The proliferation of varied types of multi-user interactive surfaces (such as digital whiteboards, tabletops and tangible interfaces) is opening a new range of applications in face-to-face (f2f) contexts. They offer unique opportunities for Learning Analytics (LA) by facilitating multi-user sensemaking of automatically captured digital footprints of students' f2f interactions. This paper presents an analysis of current research exploring learning analytics associated with the use of surface devices. We use a framework to analyse our first-hand experiences, and the small number of related deployments according to four dimensions: the orchestration aspects involved; the phases of the pedagogical practice that are supported; the target actors; and the levels of iteration of the LA process. The contribution of the paper is two-fold: 1) a synthesis of conclusions that identify the degree of maturity, challenges and pedagogical opportunities of the existing applications of learning analytics and interactive surfaces; and 2) an analysis framework that can be used to characterise the design space of similar areas and LA applications.

Citation: Roberto Martinez-Maldonado, Bertrand Schneider, Sven Charleer, Simon Buckingham Shum, Joris Klerkx and Erik Duval (2016). "Interactive Surfaces and Learning Analytics: Data, Orchestration Aspects, Pedagogical Uses and Challenges". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Bayesian Knowledge Tracing (BKT) is one of the most popular knowledge inference models due to its interpretability and ability to infer student knowledge. Proper student modeling can help guide the behavior of a cognitive tutor system and provide insight to researchers on understanding how students learn. Using four different datasets, we study the relationship between the error arising from fitting the model parameters and the difficulty index of the skills, as well as the effect of dataset size on this relationship.
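
The abstract assumes familiarity with the BKT update itself. As a reminder, a minimal sketch of the standard formulation follows; the parameter values are illustrative, and this is not the authors' fitting procedure:

```python
# Standard BKT step: Bayesian posterior after observing a response, followed
# by the learning transition. Parameter values are illustrative assumptions.
def bkt_update(p_know, correct, p_guess=0.2, p_slip=0.1, p_transit=0.15):
    if correct:
        posterior = (p_know * (1 - p_slip)) / (
            p_know * (1 - p_slip) + (1 - p_know) * p_guess)
    else:
        posterior = (p_know * p_slip) / (
            p_know * p_slip + (1 - p_know) * (1 - p_guess))
    return posterior + (1 - posterior) * p_transit  # chance of learning this step

p = 0.3  # P(L0): prior probability the skill is already mastered
for obs in [True, True, False, True]:  # hypothetical response sequence
    p = bkt_update(p, obs)
    print(f"P(mastered) = {p:.3f}")
```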

Citation: Francesc Martori, Jordi Cuadros and Lucinio González-Sabaté (2016). "Studying the relationship between BKT adjustment error and the skill's difficulty index". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

How can we best align learning analytics practices with disciplinary knowledge practices in order to support student learning? Although learning analytics itself is an interdisciplinary field, it tends to take a 'one-size-fits-all' approach to the collection, measurement, and reporting of data, overlooking disciplinary knowledge practices. In line with a recent trend in higher education research, this paper considers the contribution of a realist sociology of education to the field of learning analytics, drawing on findings from recent student focus groups at an Australian university. It examines what learners say about their data needs with reference to organizing principles underlying knowledge practices within their disciplines. The key contribution of this paper is a framework that could be used as the basis for aligning the provision and/or use of data in relation to curriculum, pedagogy, and assessment with disciplinary knowledge practices. The framework extends recent research in Legitimation Code Theory, which understands disciplinary differences in terms of the principles that underpin knowledge-building. The preliminary analysis presented here both provides a tool for ensuring a fit between learning analytics practices and disciplinary practices and standards for achievement, and signals disciplinarity as an important consideration in learning analytics practices.

Citation: Jen McPherson, Huong Ly Tong, Scott J. Fatt and Danny Y.T. Liu (2016). "Student perspectives on data provision and use: Starting to unpack disciplinary differences". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

When designing learning analytics tools for use by learners, we have an opportunity to provide tools that consider a particular learner's situation and the learner herself. To afford actual impact on learning, such tools have to be informed by theories of education. In particular, educational research shows that individual differences play a significant role in explaining students' learning process. However, limited empirical research in learning analytics has investigated the role of theoretical constructs, such as motivational factors, that underlie the observed differences between individuals. In this work, we conducted a field experiment to examine the effect of three designed learning analytics visualizations on students' participation in online discussions in authentic course settings. Using hierarchical linear mixed models, our results revealed that the effects of visualizations on the quantity and quality of messages posted by students with differences in achievement goal orientations could be either positive or negative. Our findings highlight the methodological importance of considering individual differences and pose important implications for future design and research of learning analytics visualizations.
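
The hierarchical linear mixed models mentioned in the abstract can be sketched as follows; the data frame, variable names, and grouping structure here are hypothetical, since the paper's exact specification is not reproduced in this summary:

```python
# Sketch of a hierarchical (mixed) linear model of the kind the abstract
# describes: fixed effects for visualization condition and goal orientation
# (plus their interaction), with random intercepts per course. All data and
# variable names below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "posts":         [3, 7, 2, 9, 5, 4, 8, 1, 6, 2, 7, 3],
    "visualization": ["A", "A", "B", "B", "C", "C"] * 2,
    "mastery_goal":  [4.1, 3.2, 2.8, 4.5, 3.9, 2.5, 4.8, 3.0, 3.6, 2.9, 4.2, 3.3],
    "course":        ["c1"] * 6 + ["c2"] * 6,
})

model = smf.mixedlm("posts ~ visualization * mastery_goal", df, groups=df["course"])
print(model.fit().summary())
```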

Citation: Sanam Shirazi Beheshtiha, Marek Hatala, Dragan Gašević and Srećko Joksimović (2016). "The Role of Achievement Goal Orientations When Studying Effect of Learning Analytics Visualizations". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Evidence of the month: April 2016

This report documents the emerging uses of learning analytics in the United States, Australia and the United Kingdom. Through a series of 11 case studies, it presents an overview of the evidence currently available of the impact that analytics are having on teaching and learning – and highlights some of the opportunities for the UK higher education sector.

The report was commissioned by Jisc, which is currently working with 50 universities in the UK to set up a national learning analytics service for higher and further education.

Although learning analytics are still at a relatively early stage of development, this report finds that there is convincing evidence from early adopters that they will help to improve outcomes.

The case studies presented in the report provide a snapshot of some of the most prominent institution-level learning analytics initiatives worldwide. Together they provide evidence that:

  • researchers have demonstrated the validity of the predictive models used by learning analytics systems
  • interventions carried out with students have been effective
  • there are other benefits to taking a more data-driven approach to higher education provision

Extrapolating from current practice, in the UK and internationally, the authors anticipate that learning analytics could make significant contributions:

  • As a tool for quality assurance and quality improvement
  • As a tool for boosting retention rates
  • As a tool for assessing and acting upon differential outcomes among the student population
  • As an enabler for the development and introduction of adaptive learning

The full case studies, with references, are available at http://bit.ly/1WczdOn

Citation: Sclater, N., Peasgood, A. & Mullan, J. (2016). Learning Analytics in Higher Education: A Review of UK and International Practice. Jisc, Bristol. | Url: https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v2_0.pdf

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Learning analytics makes extensive use of trace data from learners interacting with Learning Management Systems (LMS), and one of the most common uses is to derive an estimate of the time each learner spent on task, that is, engaging in particular learning activities. The authors of this paper note that while this approach is widely used, the details are often not given, and the consequences of choices made in generating these measures are not explored.
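
The paper's central point is that "time on task" is not directly observed: it is reconstructed from timestamps, and the reconstruction involves choices. A minimal sketch of one common strategy follows, with an illustrative 30-minute inactivity cap; the specific cap is an assumption, and exactly the kind of detail the authors argue should be reported:

```python
# Sketch: estimating time on task by summing gaps between consecutive events,
# capping long gaps that likely mean the learner stepped away. The 30-minute
# cap is an illustrative choice; varying it changes the resulting measure.
from datetime import datetime, timedelta

def time_on_task(timestamps, cap=timedelta(minutes=30)):
    ts = sorted(timestamps)
    return sum((min(b - a, cap) for a, b in zip(ts, ts[1:])), timedelta())

clicks = [
    datetime(2016, 4, 1, 9, 0),
    datetime(2016, 4, 1, 9, 12),
    datetime(2016, 4, 1, 11, 0),   # 108-minute gap: capped at 30 minutes
    datetime(2016, 4, 1, 11, 5),
]
print(time_on_task(clicks))                         # 0:47:00 with the cap
print(time_on_task(clicks, cap=timedelta(days=1)))  # 2:05:00 effectively uncapped
```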

They present two experiments exploring different measures of time on task, one using data from a fully-online course at a Canadian university, and another using data from a blended course at an Australian university.

They found that time-on-task measures typically outperformed count measures for predicting student performance, but more importantly, identified that the precise choice of time-on-task measure "can have a significant effect on the overall fit of the model, its significance, and eventually on the interpretation of research findings".

Given their findings, they argue that there are implications for research in learning analytics: "Above all is the need for more caution when using time-on-task measures for building learning analytics models. Given that details of time-on-task estimation can potentially impact reported research findings, appropriately addressing time-on-task estimation becomes a critical part of standard research practice in the learning analytics community. This is particularly true in cases where time-on-task measures are not accompanied by additional measures such as counts of relevant activities."

They call for researchers to recognise the importance of time-on-task measures, for fuller detail to be given of the measures used, and for further investigation.

(This paper is an expanded version of an earlier paper at LAK15, which reported only on the data from the Canadian university: Kovanovic, Vitomir, Gasevic, Dragan, Dawson, Shane, Joksimovic, Srecko, Baker, Ryan S, & Hatala, Marek. (2015). Penetrating the black box of time-on-task estimation. Paper presented at LAK '15, Poughkeepsie, NY. DOI 10.1145/2723576.2723623.)

Citation: Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., & Hatala, M. (2015) Does Time-on-task Estimation Matter? Implications on Validity of Learning Analytics Findings. | Url: https://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/4501

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper is the first empirical study linking learning design information for a substantial number of courses with Learning Management Systems (LMS)/Virtual Learning Environment (VLE) data and with learning performance.

The paper used hand-coded learning design data for 87 modules: the amount of time learners were expected to spend on each of seven types of learning design activity: assimilative; finding & handling information; communication; productive; experiential; interactive/adaptive; assessment. A cluster analysis of the designs of these modules (sketched below) suggested they fell into four broad groups: constructivist, assessment-driven, balanced-variety and social constructivist modules.
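
A sketch of that kind of cluster analysis: modules represented as profiles of designed time per activity type, grouped with k-means into four clusters. The profile numbers are hypothetical, not the paper's data, and k-means is one reasonable choice rather than necessarily the authors' method.

```python
# Sketch: clustering module learning designs by the share of study time
# allocated to each of the seven activity types. Profiles are hypothetical;
# k=4 mirrors the four groups the paper reports.
import numpy as np
from sklearn.cluster import KMeans

# Columns: assimilative, finding & handling information, communication,
# productive, experiential, interactive/adaptive, assessment.
profiles = np.array([
    [0.60, 0.05, 0.05, 0.10, 0.02, 0.03, 0.15],
    [0.55, 0.10, 0.05, 0.10, 0.05, 0.05, 0.10],
    [0.20, 0.10, 0.30, 0.20, 0.05, 0.05, 0.10],
    [0.25, 0.05, 0.25, 0.20, 0.05, 0.05, 0.15],
    [0.30, 0.15, 0.10, 0.15, 0.10, 0.10, 0.10],
    [0.35, 0.10, 0.10, 0.10, 0.10, 0.10, 0.15],
    [0.25, 0.05, 0.05, 0.15, 0.05, 0.05, 0.40],
    [0.20, 0.05, 0.10, 0.15, 0.05, 0.10, 0.35],
])

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)
print(labels)  # cluster assignment for each module
```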

The authors found significant correlations between aspects of learning design and LMS/VLE usage data. Learners on the constructivist modules spent significantly more time on the LMS/VLE than learners on the other modules. The learning design decisions of teachers seemed to strongly influence how students were engaging with the LMS/VLE. Specifically, when more inquiry- or social constructivist learning activities were included in the learning design, learners tended to use the LMS/VLE more.

Finally, they explored the link with learning performance (completion and pass rates). They found no correlation between LMS/VLE activity and learning performance. However, there was a significant negative correlation between assimilative learning activity and learning performance, i.e. modules that had more time devoted to reading/watching/listening activity had significantly lower completion and pass rates.

The authors note that this was a small sample: full data was only available for 40 modules. However, it is a good first example of a study linking learning design with VLE/LMS data and with learning performance, and is evidence that the choices teachers make in learning design terms do have an effect on the learners, both in terms of VLE/LMS activity and learning performance.

(A further study with 157 modules is in press: Toetenel, Lisette and Rienties, Bart (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology (in press).)


Citation: Bart Rienties, Lisette Toetenel, and Annie Bryan. 2015. "Scaling up" learning design: impact of learning design activities on LMS behavior and performance. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (LAK '15). ACM, New York, NY, USA, 315-319. | Url: http://dl.acm.org/citation.cfm?doid=2723576.2723600