Archives

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

When used effectively, reflective writing tasks can deepen learners' understanding of key concepts, help them critically appraise their developing professional identity, and build qualities for lifelong learning. As such, reflective writing is attracting substantial interest from universities concerned with experiential learning, reflective practice, and developing a holistic conception of the learner. However, reflective writing is for many students a novel genre to compose in, and tutors may be inexperienced in its assessment. While these conditions set a challenging context for automated solutions, natural language processing may also help address the challenge of providing real-time, formative feedback on draft writing. This paper reports progress in designing a writing analytics application, detailing the methodology by which informally expressed rubrics are modelled as formal rhetorical patterns, a capability delivered by a novel web application. The application has been iteratively evaluated on an independently human-annotated corpus, showing improvements from the first version to the second. We conclude by discussing the reasons why classifying reflective writing has proven complex, and reflect on the design processes that enabled work across disciplinary boundaries to develop the prototype to its current state.
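
The core move described here, modelling informally expressed rubric items as formal rhetorical patterns and evaluating them against human annotations, can be illustrated in miniature. The sketch below is a drastic simplification with invented patterns, labels and sentences; the paper's actual system rests on far richer linguistic analysis than keyword regexes:

```python
# Minimal sketch: a rubric item modelled as surface patterns, evaluated
# against a human-annotated corpus. All patterns and data are invented.
import re

# Hypothetical rubric item: "expresses a change in perspective".
PATTERNS = {
    "change_in_perspective": [
        re.compile(r"\bI (?:now|no longer) (?:think|believe|see)\b", re.I),
        re.compile(r"\bmy (?:view|perspective|understanding) (?:has )?changed\b", re.I),
    ],
}

def classify(sentence):
    """Return the set of rubric labels whose patterns match the sentence."""
    return {label for label, pats in PATTERNS.items()
            if any(p.search(sentence) for p in pats)}

def evaluate(corpus, label):
    """Precision and recall of one label against human annotations."""
    tp = fp = fn = 0
    for sentence, gold in corpus:
        predicted = label in classify(sentence)
        if predicted and label in gold:
            tp += 1
        elif predicted:
            fp += 1
        elif label in gold:
            fn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

corpus = [  # toy human-annotated sentences: (text, gold labels)
    ("I now see the client's needs quite differently.", {"change_in_perspective"}),
    ("The placement ran for six weeks.", set()),
]
print(evaluate(corpus, "change_in_perspective"))  # -> (1.0, 1.0)
```

On this reading, iterative evaluation amounts to adjusting the patterns and re-running the precision/recall check against the annotated corpus between versions.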

Citation: Simon Buckingham Shum, Agnes Sandor, Rosalie Goldsmith, Xiaolong Wang, Randall Bass and Mindy McWilliams (2016). "Reflecting on Reflective Writing Analytics: Assessment Challenges and Iterative Evaluation of a Prototype Tool". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

In this paper we present a learning analytics conceptual framework that supports enquiry-based evaluation of learning designs. The dimensions of the proposed framework emerged from a review of existing analytics tools, the analysis of interviews with teachers, and user case profiles to understand what types of analytics would be useful in evaluating a learning activity in relation to pedagogical intent. The proposed framework incorporates various types of analytics, with the teacher playing a key role in bringing context to the analysis and making decisions on the feedback provided to students as well as the scaffolding and adaptation of the learning design. The framework consists of five dimensions: temporal analytics, tool-specific analytics, cohort dynamics, comparative analytics and contingency. Specific metrics and visualisations are also defined for each dimension of the conceptual framework. Finally the development of a tool that partially implements the conceptual framework is discussed.
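
As a rough reading aid, the five dimensions lend themselves to a simple structural encoding; the metrics and visualisations below are illustrative placeholders, not the paper's own definitions:

```python
# Illustrative encoding of the framework's five dimensions. The example
# metrics and visualisations are placeholders, not the paper's exact lists.
FRAMEWORK = {
    "temporal analytics":      {"metrics": ["logins over time", "time-on-task"],
                                "visualisation": "timeline"},
    "tool-specific analytics": {"metrics": ["forum posts", "quiz attempts"],
                                "visualisation": "per-tool dashboard"},
    "cohort dynamics":         {"metrics": ["interaction network density"],
                                "visualisation": "sociogram"},
    "comparative analytics":   {"metrics": ["cohort vs. cohort activity"],
                                "visualisation": "side-by-side charts"},
    "contingency":             {"metrics": ["divergence from pedagogical intent"],
                                "visualisation": "alert/flag view"},
}

for dimension, spec in FRAMEWORK.items():
    print(f"{dimension}: {spec['metrics']} -> {spec['visualisation']}")
```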

Citation: Aneesha Bakharia, Linda Corrin, Paula de Barba, Gregor Kennedy, Dragan Gasevic, Raoul Mulder, David Williams, Shane Dawson and Lori Lockyer (2016). "A Conceptual Framework linking Learning Design with Learning Analytics". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

The prevalence of early alert systems (EAS) at tertiary institutions is increasing. These systems are designed to assist with targeted student support in order to improve student retention. They also require considerable human and capital resources to implement, with significant costs involved. It is therefore imperative that the systems can demonstrate quantifiable financial benefits to the institution. The purpose of this paper is to report on the financial implications of implementing an EAS at an Australian university as a case study. The case study institution implemented an EAS in 2011 using data generated from a data warehouse. The data set comprises 16,124 students enrolled between 2011 and 2013. Using a treatment effects approach, the study found that the cost of a student discontinuing was on average $4,687. Students identified by the EAS remained enrolled for longer, with the institution benefiting from approximately an additional $4,004 in revenue per student. All schools within the institution showed a significant positive effect associated with the EAS. Finally, the EAS showed significant value to the institution regardless of when a student was identified. The results indicate that the EAS had significant financial benefits for this institution and that these benefits extended to the entire institution beyond the first year of enrolment.
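
As a back-of-envelope illustration of the comparison involved, consider the sketch below. The group sizes and per-semester revenue are invented, and the paper used a formal treatment-effects model rather than this naive difference in means; only the headline figure of roughly $4,004 per identified student comes from the study:

```python
# Back-of-envelope sketch of the revenue comparison described above.
# All inputs are hypothetical; the paper's estimate came from a formal
# treatment-effects model on 16,124 students, not this naive calculation.
REVENUE_PER_SEMESTER = 2_000   # hypothetical tuition revenue per semester, AUD

identified = [4, 3, 5, 4]      # semesters enrolled, students flagged by the EAS
comparison = [2, 3, 2, 1]      # semesters enrolled, comparable unflagged students

def mean(xs):
    return sum(xs) / len(xs)

# Naive average treatment effect: extra semesters retained x revenue per semester.
extra_semesters = mean(identified) - mean(comparison)
extra_revenue = extra_semesters * REVENUE_PER_SEMESTER
print(f"~${extra_revenue:,.0f} additional revenue per identified student")
```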

Citation: Scott Harrison, Rene Villano, Grace Lynch and George Chen (2016). "Measuring financial implications of an early alert system". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

The proliferation of varied types of multi-user interactive surfaces (such as digital whiteboards, tabletops and tangible interfaces) is opening a new range of applications in face-to-face (f2f) contexts. They offer unique opportunities for Learning Analytics (LA) by facilitating multi-user sensemaking of automatically captured digital footprints of students' f2f interactions. This paper presents an analysis of current research exploring learning analytics associated with the use of surface devices. We use a framework to analyse our first-hand experiences, and the small number of related deployments according to four dimensions: the orchestration aspects involved; the phases of the pedagogical practice that are supported; the target actors; and the levels of iteration of the LA process. The contribution of the paper is two-fold: 1) a synthesis of conclusions that identify the degree of maturity, challenges and pedagogical opportunities of the existing applications of learning analytics and interactive surfaces; and 2) an analysis framework that can be used to characterise the design space of similar areas and LA applications.

Citation: Roberto Martinez-Maldonado, Bertrand Schneider, Sven Charleer, Simon Buckingham Shum, Joris Klerkx and Erik Duval (2016). "Interactive Surfaces and Learning Analytics: Data, Orchestration Aspects, Pedagogical Uses and Challenges". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

How can we best align learning analytics practices with disciplinary knowledge practices in order to support student learning? Although learning analytics itself is an interdisciplinary field, it tends to take a 'one-size-fits-all' approach to the collection, measurement, and reporting of data, overlooking disciplinary knowledge practices. In line with a recent trend in higher education research, this paper considers the contribution of a realist sociology of education to the field of learning analytics, drawing on findings from recent student focus groups at an Australian university. It examines what learners say about their data needs with reference to organizing principles underlying knowledge practices within their disciplines. The key contribution of this paper is a framework that could be used as the basis for aligning the provision and/or use of data in relation to curriculum, pedagogy, and assessment with disciplinary knowledge practices. The framework extends recent research in Legitimation Code Theory, which understands disciplinary differences in terms of the principles that underpin knowledge-building. The preliminary analysis presented here both provides a tool for ensuring a fit between learning analytics practices and disciplinary practices and standards for achievement, and signals disciplinarity as an important consideration in learning analytics practices.

Citation: Jen McPherson, Huong Ly Tong, Scott J. Fatt and Danny Y.T. Liu (2016). "Student perspectives on data provision and use: Starting to unpack disciplinary differences". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

In this paper, the authors describe a conceptual model for the analysis of informal learning in online social networks for workers, in particular for health professionals. This sector was selected due to its particular characteristics: staying up to date and delivering best evidence-based care is crucial for these professionals, and they need to be lifelong learners as medical knowledge expands and changes rapidly.

In this environment, "Online social networking (OSN) provides a new way for health professionals to communicate, collaborate and share ideas with each other for informal learning on a massive scale. It has important implications for ongoing efforts to support Continuing Professional Development (CPD) in the health professions. However, the challenge of analysing the data generated in OSNs makes it difficult to understand whether and how they are useful for CPD".

The paper explores three approaches to the analysis of OSNs: Content Analysis (CA), Social Network Analysis (SNA) and, most innovatively, Social Learning Analytics (SLA), a sub-field of Learning Analytics.

The described conceptual model seeks to merge the CA and SNA approaches, and also incorporates a survey to evaluate the learning outcome in place of real clinical data. The model is divided into three sections (a code sketch follows the list):

  • Learning interactions: focuses on the structure of interactions, the level of engagement, and the factors that influence it.
  • Learning process: involves the examination of cognitive presence, social presence, facilitation presence and learning presence.
  • Learning outcome: measures the social value for online community members, in terms of valued activities, gained knowledge, changed practice, improved performance and redefined success.
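
A minimal sketch of how these three sections could be operationalised on forum data, assuming the networkx library; the interactions, content codes and survey items below are invented for illustration:

```python
# Sketch of the model's three sections applied to toy forum data.
import networkx as nx

# 1. Learning interactions: the structure of who replies to whom.
replies = [("ann", "bob"), ("bob", "ann"), ("cat", "ann")]
g = nx.DiGraph(replies)
engagement = nx.degree_centrality(g)  # crude engagement proxy

# 2. Learning process: content-analysis codes per message (e.g. a
#    presence-style coding scheme applied by human raters).
codes = ["cognitive", "social", "cognitive", "facilitation"]
process_profile = {c: codes.count(c) for c in set(codes)}

# 3. Learning outcome: survey in place of clinical data, e.g. mean
#    self-reported agreement with "I changed my practice" (1-5 scale).
survey_scores = [4, 5, 3, 4]
outcome = sum(survey_scores) / len(survey_scores)

print(engagement, process_profile, outcome)
```
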
Citation: Li, X., Gray, K., Chang, S., Elliott, K., & Barnett, S. (2014). A conceptual model for analysing informal learning in online social networks for health professionals. Studies in Health Technology and Informatics, 204, 80-85. | Url: https://minerva-access.unimelb.edu.au/handle/11343/43110

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector: | Country:

The paper defines privacy as the regulation of how personal digital information is being observed by the self or distributed to other observers. It defines ethics as the systematization of correct and incorrect behaviour in virtual spaces according to all stakeholders.

Principles identified within this paper are:

  • transparency
  • student control over data
  • rights of access
  • accountability and assessment

The authors argue that, by discussing the various aspects within each of these categories, institutions have mechanisms to assess their initiatives and achieve compliance with current laws and regulations as well as with socially derived requirements.

In terms of the Evidence Hub proposition 'Learning analytics are used in an ethical way', the need for a paper such as this to be written implies that appropriate measures are being developed but are not yet in place. As the authors note, 'The ethical and privacy issues derived from these scenarios are not properly addressed.'

Summary provided within the paper:

What is already known about this topic
• Learning analytics offers the possibility of collecting detailed information about how students learn.
• The ethical and privacy issues derived from these scenarios are not properly addressed.
• There is a need to clarify how these issues must be addressed from the early stages of the deployment of a learning analytics scenario.
What this paper adds
• An account of how the main legislations are advancing in the general area of privacy.
• A comparison of how other disciplines such as medicine have dealt with privacy issues when collecting private information.
• The description of a group of practical principles in which to include all the ethical and privacy-related issues present when deploying a learning analytics application.
Implications for practice and/or policy
• Designers may now take into account these principles to guide the implementation of learning analytics platforms.
• Students are now aware of the different categories of issues that must be addressed by applications collecting data while they learn.
• Instructors now have a more detailed account of the issues to address when adopting learning analytics techniques.

Citation: Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438-450. | Url: https://www.researchgate.net/profile/Abelardo_Pardo/publications?pubType=article

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

This study examined the extent to which instructional conditions influence the prediction of academic success in nine undergraduate courses offered in a blended learning model (n=4134).

The results of the study indicate that using generalized models to predict academic success threatens the potential of learning analytics to improve the quality of teaching and learning practice. This is consistent with the literature on learning theory, which stresses the importance of contextual factors when considering academic risk.

The authors suggest that the under-explored role of contextual variables may help explain the mixed findings in the learning analytics field. To date, even large-scale studies have reported differences in the overall predictive power of the same variables associated with activity in learning management systems (LMS).

The authors conclude that it is crucial for learning analytics research to take into account the diverse ways in which technology is adopted and applied in course-specific contexts. A lack of attention to instructional conditions can result in over- or under-estimation of the effects of LMS features on academic success.
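
The contrast the authors draw, a single model pooled across courses versus models fitted per course, can be sketched as follows. The data are simulated and the feature is hypothetical; this assumes scikit-learn and is not the authors' actual analysis:

```python
# Simulated contrast between a pooled model and per-course models.
# Data, the single "logins" feature, and effect sizes are all invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
rows, courses = 600, 3
course = rng.integers(0, courses, rows)
logins = rng.normal(20, 5, rows)

# Course-dependent effects: LMS activity predicts success strongly in
# course 0, weakly in course 1, and negatively in course 2.
slope = np.array([0.3, 0.05, -0.2])[course]
p = 1 / (1 + np.exp(-(slope * (logins - 20))))
success = rng.random(rows) < p

X, y = logins.reshape(-1, 1), success

pooled = LogisticRegression().fit(X, y)
print("pooled AUC:", round(roc_auc_score(y, pooled.predict_proba(X)[:, 1]), 2))

for c in range(courses):
    m = course == c
    per_course = LogisticRegression().fit(X[m], y[m])
    auc = roc_auc_score(y[m], per_course.predict_proba(X[m])[:, 1])
    print(f"course {c} AUC:", round(auc, 2))
```

In the simulation, the pooled model averages away the course where activity predicts success negatively, which is exactly the kind of over- and under-estimation the authors warn about.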

The authors suggest the use of learning analytics studies designed with clear theoretical frameworks that will a) connect learning analytics research with decades of previous research in education and b) make clear what is contended by research designs, and so make explicit what the research outcomes mean in relation to existing models and previous findings.

In relation to the propositions of this Evidence Hub, the paper notes that, “Learning analytics will not be of practical value or widely adopted if it cannot offer insights that are useful for both learners and teachers”.

Citation: Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84. | Url: https://www.researchgate.net/profile/Dragan_Gasevic/publications?pubType=article

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector: | Country:

This paper provides an overview of privacy and considers the potential contribution contemporary privacy theories can make to learning analytics. It is written from the position that having the technical capability to conduct a particular learning analytics task does not automatically mean that the task should be performed. It reflects on how privacy theories can help advance learning analytics and stresses the importance of hearing the student voice in this space.

“Transmission principles regarding the provision of a student’s personal demographic data in the student application, admission, and administration context do not necessarily apply in any other context. If a student agrees to the flow of student equity‐related data to support admission processes, he or she is not necessarily agreeing to the same terms and conditions of information flow in another context, such as secondary use of data for learning analytics activities.”

This paper is considered to be positive evidence for ‘Proposition D: Learning analytics are used in an ethical way’ because it provides evidence that learning analytics researchers are thinking carefully about these issues.

Citation: Heath, Jennifer. (2014). Contemporary privacy theory contributions to learning analytics. Journal of Learning Analytics, 1(1), 140-149. | Url: http://epress.lib.uts.edu.au/journals/index.php/JLA/issue/view/307

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Evidence of the Month: April 2015

This paper reports findings from a study of MOOC log file data relating to a large University of Melbourne MOOC that ran in 2013. The study investigated a series of hypotheses: (i) there is skill involved in using forums to learn; (ii) MOOC participants use forums differently as they progress from novice to expert in this skill; (iii) this progression is reflected in log file data; and (iv) log file data can be used to measure a learner's skill in learning through forums. The study provides provisional support for each of these hypotheses.

The paper also sets out how learners' skill in using forums can develop. The proposed framework identifies how learners at each level are likely to view knowledge, how they are likely to view forums in the context of learning, and how they are likely to use MOOC forums. The framework has five levels:

  • Level 1: Novice, dependent learner
  • Level 2: Beginning, independent learner
  • Level 3: Proficient, collegial learner
  • Level 4: Competent, collaborative learner
  • Level 5: Expert, learning leader

For example, at Level 1: 'Learning is about consuming stable knowledge in a domain, comprised mainly of cognitive understanding or skill; seeks efficient transfer from authoritative or reputable sources; follows procedural guidance from teachers, relinquishing responsibility for learning process; calibrates performance on formal assessments and accepts standards inherent in them. Forums are essentially social adjuncts to courses; the information is possibly unreliable and misleading. Never visits forums.'

The relationship between achievement levels on the course studied and level on this scale was apparently strong. Of those rated at expert level 5, 78% received at least a pass score and 67% a distinction score. However, of those rated at level 1, less than 1% received a pass score.
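
For illustration only, a summary of that kind can be recomputed from per-learner records; the records below are invented, and only the shape of the calculation matters:

```python
# Hypothetical per-learner records: rated forum-skill level plus final grade.
learners = [
    {"level": 5, "grade": "distinction"},
    {"level": 5, "grade": "pass"},
    {"level": 1, "grade": "fail"},
    {"level": 1, "grade": "fail"},
]

def rate(level, passing):
    """Share of learners at a given level whose grade is in `passing`."""
    group = [l for l in learners if l["level"] == level]
    return sum(l["grade"] in passing for l in group) / len(group)

for level in (1, 5):
    print(f"level {level}: pass-or-better {rate(level, {'pass', 'distinction'}):.0%}, "
          f"distinction {rate(level, {'distinction'}):.0%}")
```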

Citation: Milligan, Sandra. (2015). Crowd-sourced learning in MOOCs: learning analytics meets measurement theory. Paper presented at the Learning Analytics and Knowledge conference (LAK15), Poughkeepsie, NY, USA. | Url: http://dl.acm.org/citation.cfm?id=2723596