Tag Archives: Measurement

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

In this paper, the authors describe a conceptual model for the analysis of informal learning in online social networks for workers, and in particular for health professionals. This sector was selected for its particular characteristics: staying up to date and delivering the best evidence-based care is crucial for these professionals, and they need to be lifelong learners as medical knowledge expands and changes rapidly.

In this environment, "Online social networking (OSN) provides a new way for health professionals to communicate, collaborate and share ideas with each other for informal learning on a massive scale. It has important implications for ongoing efforts to support Continuing Professional Development (CPD) in the health professions. However, the challenge of analysing the data generated in OSNs makes it difficult to understand whether and how they are useful for CPD".

The paper explores three approaches to the analysis of OSNs: Content Analysis (CA), Social Network Analysis (SNA) and, most innovatively, Social Learning Analytics (SLA), a sub-field of Learning Analytics.

The conceptual model described merges the CA and SNA approaches, and also uses a survey, rather than real clinical data, to evaluate the learning outcome. The model is divided into three sections:

  • Learning interactions: focuses on the structure of interactions, the level of engagement, and the factors that influence engagement.
  • Learning process: involves the examination of cognitive presence, social presence, facilitation presence and learning presence.
  • Learning outcome: measures the social value for online community members, in terms of valued activities, gained knowledge, changed practice, improved performance and redefined success.
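
As a rough illustration of merging the two approaches (using purely hypothetical data and coding labels, not anything from the paper): an SNA view counts who interacts with whom, while a CA view codes what each message contains.

```python
from collections import Counter

# Hypothetical OSN interactions: (sender, recipient, content_code),
# where content_code is a CA label such as "cognitive" or "social" presence.
interactions = [
    ("nurse_a", "doctor_b", "cognitive"),
    ("doctor_b", "nurse_a", "cognitive"),
    ("nurse_c", "doctor_b", "social"),
    ("doctor_b", "nurse_c", "facilitation"),
]

# SNA view: a simple degree count -- how many ties each member is involved in.
degree = Counter()
for sender, recipient, _ in interactions:
    degree[sender] += 1
    degree[recipient] += 1

# CA view: distribution of coded message types.
codes = Counter(code for _, _, code in interactions)

print(degree.most_common(1))  # [('doctor_b', 4)]
print(codes)  # Counter({'cognitive': 2, 'social': 1, 'facilitation': 1})
```

A merged model, as the paper proposes, would relate measures like these (who is central, what kinds of presence they exhibit) to the surveyed learning outcomes.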
Citation: Li, X., Gray, K., Chang, S., Elliott, K., & Barnett, S. (2014). A conceptual model for analysing informal learning in online social networks for health professionals. Stud Health Technol Inform, 204, pp. 80–85 | Url: https://minerva-access.unimelb.edu.au/handle/11343/43110

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Learning analytics makes extensive use of trace data from learners interacting with Learning Management Systems (LMS), and one of the most common uses is to derive an estimate of the time each learner spent on task, that is, engaging in particular learning activities. The authors of this paper note that while this approach is widely used, the details are often not given, and the consequences of choices made in generating these measures are not explored.

They present two experiments exploring different measures of time on task, one using data from a fully-online course at a Canadian university, and another using data from a blended course at an Australian university.

They found that time-on-task measures typically outperformed count measures for predicting student performance, but more importantly, identified that the precise choice of time-on-task measure "can have a significant effect on the overall fit of the model, its significance, and eventually on the interpretation of research findings".
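
The distinction between the two kinds of measure can be sketched as follows (hypothetical log format and a simple gap-capping heuristic, not the authors' actual feature-extraction code):

```python
# Hypothetical trace-data rows: (student_id, activity, minutes_since_course_start).
log = [
    ("s1", "forum", 0), ("s1", "forum", 4), ("s1", "quiz", 9),
    ("s2", "forum", 2), ("s2", "quiz", 60), ("s2", "quiz", 63),
]

def count_measure(log, student, activity):
    """Count measure: how many actions of a given type the student performed."""
    return sum(1 for s, a, _ in log if s == student and a == activity)

def time_on_task(log, student, cap=30):
    """Time-on-task estimate: sum the gaps between the student's consecutive
    actions, discarding gaps above `cap` minutes as presumed off-task time."""
    times = sorted(t for s, _, t in log if s == student)
    return sum(b - a for a, b in zip(times, times[1:]) if b - a <= cap)

print(count_measure(log, "s1", "forum"))  # 2
print(time_on_task(log, "s2"))  # 3 (the 58-minute gap is discarded)
```

Both measures are derived from the same trace data, but the time-on-task estimate depends on modelling choices (here, the cap) that the count does not.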

Given their findings, they argue that there are implications for research in learning analytics: "Above all is the need for more caution when using time-on-task measures for building learning analytics models. Given that details of time-on-task estimation can potentially impact reported research findings, appropriately addressing time-on-task estimation becomes a critical part of standard research practice in the learning analytics community. This is particularly true in cases where time-on-task measures are not accompanied by additional measures such as counts of relevant activities."

They call for researchers to recognise the importance of time-on-task measures, for fuller detail to be given of the measures used, and for further investigation.

(This paper is an expanded version of an earlier paper at LAK15, which reported only on the data from the Canadian university: Kovanovic, Vitomir, Gasevic, Dragan, Dawson, Shane, Joksimovic, Srecko, Baker, Ryan S, & Hatala, Marek. (2015). Penetrating the black box of time-on-task estimation. Paper presented at LAK '15, Poughkeepsie, NY. DOI 10.1145/2723576.2723623.)

Citation: Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., & Hatala, M. (2015). Does Time-on-task Estimation Matter? Implications on Validity of Learning Analytics Findings. Journal of Learning Analytics. | Url: https://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/4501

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

There is a large body of research suggesting that the amount of time spent on learning can improve the quality of learning, as represented by academic performance. The widespread adoption of learning technologies such as learning management systems (LMSs) has resulted in large amounts of data about student learning being readily accessible to educational researchers. One common use of this data is to measure the time that students have spent on different learning tasks. Given that an LMS typically captures only the times at which students executed various actions, time-on-task measures can only ever be estimates.

This paper takes five learning analytics models of student performance, and examines the consequences of using 15 different time-on-task estimation strategies. It finds that choice of estimation strategy can have a significant effect on the overall fit of a model, its significance, and the interpretation of research findings.

The paper concludes that:

  1. The learning analytics community should recognise the importance of time-on-task estimation and the role it plays in the quality of analytical models and their interpretation;
  2. Publications should explain in detail how time on task has been estimated, in order to support the development of open, replicable and reproducible research;
  3. This area should be investigated further in order to provide a set of standards and common practices for the conduct of learning analytics research.

This was selected as the best paper at the Learning Analytics and Knowledge conference 2015.

Citation: Kovanovic, Vitomir, Gasevic, Dragan, Dawson, Shane, Joksimovic, Srecko, Baker, Ryan S, & Hatala, Marek. (2015). Penetrating the black box of time-on-task estimation. Paper presented at LAK '15, Poughkeepsie, NY. DOI 10.1145/2723576.2723623 | Url: http://dl.acm.org/citation.cfm?id=2723576