Archives

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Little is known about the processes institutions use when discerning their readiness to implement learning analytics. This study aims to address this gap in the literature by using survey data from the beta version of the Learning Analytics Readiness Instrument (LARI) [1]. Twenty-four institutions were surveyed and 560 respondents participated. Five distinct factors were identified from a factor analysis of the results: Culture; Data Management Expertise; Data Analysis Expertise; Communication and Policy Application; and Training. Data were analyzed using both the role of those completing the survey and the Carnegie classification of the institutions as lenses. Generally, information technology professionals and institutions classified as Research Universities-Very High research activity had significantly different scores on the identified factors. Working within a framework of organizational learning, this paper details the concept of readiness as a reflective process, as well as how the implementation and application of analytics should be undertaken with ethical considerations in mind. Limitations of the study, as well as next steps for research in this area, are also discussed.

Citation: Meghan Oster, Steven Lonn, Matthew Pistilli and Michael Brown (2016). "The Learning Analytics Readiness Instrument". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
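
As an illustration of the kind of analysis the abstract describes, the sketch below extracts a five-factor solution from a synthetic Likert-style response matrix. The data, item count, and use of scikit-learn's FactorAnalysis are assumptions for illustration, not details taken from the study.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Hypothetical stand-in for a 560-respondent x 40-item survey response matrix.
X = rng.integers(1, 6, size=(560, 40)).astype(float)

fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(X)

factors = ["Culture", "Data Management Expertise", "Data Analysis Expertise",
           "Communication and Policy Application", "Training"]
for name, loadings in zip(factors, fa.components_):
    # Report the items loading most strongly on each (illustratively named) factor.
    top_items = np.argsort(np.abs(loadings))[::-1][:3]
    print(f"{name}: highest-loading items {top_items.tolist()}")
```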

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

The prevalence of early alert systems (EAS) at tertiary institutions is increasing. These systems are designed to assist with targeted student support in order to improve student retention. They also require considerable human and capital resources to implement, with significant costs involved. It is therefore imperative that these systems can demonstrate quantifiable financial benefits to the institution. The purpose of this paper is to report on the financial implications of implementing an EAS at an Australian university as a case study. The case study institution implemented an EAS in 2011 using data generated from a data warehouse. The data set comprises 16,124 students enrolled between 2011 and 2013. Using a treatment effects approach, the study found that the cost of a student discontinuing was on average $4,687. Students identified by the EAS remained enrolled for longer, with the institution benefiting from approximately an additional $4,004 in revenue per student. All schools within the institution showed a significant positive effect associated with the EAS. Finally, the EAS showed significant value to the institution regardless of when the student was identified. The results indicate that the EAS had significant financial benefits for this institution and that the benefits extended to the entire institution and beyond the first year of enrolment.

Citation: Scott Harrison, Rene Villano, Grace Lynch and George Chen (2016). "Measuring financial implications of an early alert system". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
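
A back-of-the-envelope sketch of the revenue comparison the abstract reports. The enrolment model, dollar figure, and naive difference-in-means below are invented stand-ins; the paper itself uses a treatment effects model, which additionally corrects for selection into the identified group.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16_124
identified = rng.integers(0, 2, size=n).astype(bool)   # flagged by the EAS
semesters = rng.poisson(4 + identified, size=n)        # semesters enrolled (synthetic)
revenue_per_semester = 2_000.0                         # illustrative figure, not from the paper

revenue = semesters * revenue_per_semester
# Naive difference in mean revenue between identified and non-identified students.
effect = revenue[identified].mean() - revenue[~identified].mean()
print(f"Naive revenue difference per identified student: ${effect:,.0f}")
```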

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

In the last few years, there has been a growing interest in learning analytics (LA) in technology-enhanced learning (TEL). Generally, LA deals with the development of methods that harness educational data sets to support the learning process. Recently, the concept of open learning analytics (OLA) has received a great deal of attention from the LA community, due to the growing demand for self-organized, networked, and lifelong learning opportunities. A key challenge in OLA is to follow a personalized and goal-oriented LA model that tailors the LA task to the needs and goals of multiple stakeholders. Current implementations of LA rely on a predefined set of questions and indicators. There is, however, a need to adopt a personalized LA approach that engages end users in the indicator definition process by supporting them in setting goals, posing questions, and self-defining the indicators that help them achieve their goals. In this paper, we address the challenge of personalized LA and present the conceptual, design, and implementation details of a rule-based indicator definition tool to support flexible definition and dynamic generation of indicators to meet the needs of different stakeholders with diverse goals and questions in the LA exercise.

Citation: Arham Muslim and Mohamed Amine Chatti (2016). "A Rule-Based Indicator Definition Tool for Personalized Learning Analytics". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
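
A minimal sketch of what a rule-based, user-defined indicator might look like. The event schema and the Indicator structure are hypothetical and do not reflect the tool's actual design.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical event record; field names are illustrative.
Event = dict

@dataclass
class Indicator:
    name: str
    rule: Callable[[Event], bool]      # predicate selecting relevant events
    measure: Callable[[list], float]   # aggregation over the selected events

events = [
    {"user": "s1", "action": "post", "course": "c1"},
    {"user": "s1", "action": "view", "course": "c1"},
    {"user": "s2", "action": "post", "course": "c1"},
]

# An end user defines an indicator by combining a rule with a measure.
forum_activity = Indicator(
    name="forum posts in c1",
    rule=lambda e: e["action"] == "post" and e["course"] == "c1",
    measure=lambda es: float(len(es)),
)

selected = [e for e in events if forum_activity.rule(e)]
print(forum_activity.name, "=", forum_activity.measure(selected))
```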

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

In this paper we discuss the results of a study of students' academic performance in first year general education courses. Using data from 566 students who received intensive academic advising as part of their enrollment in the institution's pre-major/general education program, we investigate individual student, organizational, and disciplinary factors that might predict a student's potential classification in an Early Warning System, as well as factors that predict improvement and decline in their academic performance. Disciplinary course type (based on Biglan's [7] typology) was significantly related to a student's likelihood to enter below average performance classifications. Students were most likely to enter a classification in fields such as the natural sciences, mathematics, and engineering in comparison to humanities courses. We attribute these disparities in academic performance to disciplinary norms around teaching and assessment. In particular, the timing of assessments played a major role in students' ability to exit a classification. Implications for the design of Early Warning analytics systems as well as academic course planning in higher education are offered.

Citation: Michael Brown, R. Matthew Demonbrun, Steven Lonn, Stephen Aguilar and Stephanie Teasley (2016). "What and When: The Role of Course Type and Timing in Students' Academic Performance". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
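
To make the prediction setup concrete, here is a hedged sketch of a logistic model relating illustrative student and course features to entry into a below-average classification. The features, their coding, and the data are invented for illustration and are not the study's variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 566
# Illustrative predictors: hard/soft discipline flag (after Biglan),
# credit load, and week of the first graded assessment.
X = np.column_stack([
    rng.integers(0, 2, n),    # 1 = hard discipline (science/math/engineering)
    rng.integers(8, 19, n),   # credit hours
    rng.integers(2, 12, n),   # week of first graded assessment
])
y = rng.integers(0, 2, n)     # 1 = entered a below-average classification

model = LogisticRegression().fit(X, y)
# Odds ratios per feature; values near 1 mean little association in this synthetic data.
print("odds ratios:", np.exp(model.coef_).round(2))
```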

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

We study how the use of online learning systems stimulates cognitive activities by conducting an experiment that used eye tracking technology to monitor the eye fixations of 60 final year students engaging in online interactive tutorials at the start of their Final Year Project module. Our findings show that the students' learning behaviours fall into three distinct types of eye fixation patterns, and that the data corresponding to the different types of learners relate to the students' performance in other related academic modules. We conclude that this method of studying eye fixation patterns can identify different types of learners with respect to their cognitive capability and academic potential, and can also allow educators to understand how their instructional design and online learning environment stimulate higher-order cognitive activities.

Citation: Benedict The and Manolis Mavrikis (2016). "A Study On Eye Fixation Patterns of Students in Higher Education Using an Online Learning System". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
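
One plausible way to recover the three fixation-pattern groups the abstract mentions is unsupervised clustering of per-student fixation features. The feature set and the k-means choice below are assumptions for illustration, not the study's actual method.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Illustrative per-student features: mean fixation duration (ms),
# fixation count, and proportion of fixations on the content region.
features = np.column_stack([
    rng.normal(250, 60, 60),
    rng.normal(900, 200, 60),
    rng.uniform(0.3, 0.9, 60),
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
for label in range(3):
    group = features[kmeans.labels_ == label]
    print(f"pattern {label}: n={len(group)}, mean fixation {group[:, 0].mean():.0f} ms")
```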

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Statway is one of the Community College Pathways initiatives designed to promote students' success in their developmental math sequence and reduce the time required to earn college credit. A recent causal analysis confirmed that Statway dramatically increased students' success rates in half the time across two different cohorts. These impressive results also held across gender and race/ethnicity groups. However, there is still room for improvement. Students who did not succeed overall in Statway often did not complete the first course of the two-course sequence. Therefore, the objective of this study is to formulate a series of indicators from self-report and online learning system data, alerting instructors to students' progress during the first weeks of the first course in the Statway sequence.

Citation: Ouajdi Manai, Hiroyuki Yamada and Christopher Thorn (2016). "Real-time indicators and targeted supports: Using online platform data to accelerate student learning". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
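
A minimal sketch of the sort of first-weeks indicator the study aims to formulate, combining online-system and self-report signals into an instructor alert. The column names and thresholds are hypothetical.

```python
import pandas as pd

# Hypothetical first-weeks platform and self-report data; column names are illustrative.
df = pd.DataFrame({
    "student": ["a", "b", "c"],
    "lessons_completed_wk3": [9, 2, 6],
    "practice_correct_rate": [0.85, 0.40, 0.70],
    "self_reported_confidence": [4, 2, 3],   # 1-5 scale
})

# Simple threshold-based alert: flag a student if any early signal looks weak.
df["alert"] = (
    (df["lessons_completed_wk3"] < 4)
    | (df["practice_correct_rate"] < 0.5)
    | (df["self_reported_confidence"] <= 2)
)
print(df[df["alert"]][["student"]])
```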

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

The 'doer effect' is an association between the number of online interactive practice activities students do and their learning outcomes that is not only statistically reliable but also much stronger than the effects of other learning resources, such as watching videos or reading text. Such an association suggests a causal interpretation--more doing yields better learning--which requires randomized experimentation to confirm most rigorously. But such experiments are expensive, and any single experiment in a particular course context does not provide rigorous evidence that the causal link will generalize to other course content. We suggest that analytics of increasingly available online learning data sets can complement experimental efforts by facilitating more widespread evaluation of the generalizability of claims about which learning methods produce better student learning outcomes. We illustrate with analytics that narrow in on a causal interpretation of the doer effect by showing that doing within a course unit predicts learning of that unit's content better than doing in units before or after. We also provide generalizability evidence across four different courses involving over 12,500 students that the learning effect of doing is about six times greater than that of reading.

Citation: Kenneth R. Koedinger, Elizabeth A. McLaughlin, Julianna Zhuxin Jia and Norman L. Bier (2016). "Is the Doer Effect a Causal Relationship? How Can We Tell and Why It's Important". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
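
The core analytic claim can be illustrated with a regression on synthetic data deliberately constructed so that doing matters about six times more than reading, mirroring the ratio the paper reports. All variables and data here are invented; the paper's actual models are more elaborate.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n = 12_500
doing = rng.uniform(0, 1, n)    # proportion of unit practice activities attempted
reading = rng.uniform(0, 1, n)  # proportion of unit pages read
video = rng.uniform(0, 1, n)    # proportion of unit videos watched
# Synthetic outcome built so the doing coefficient is ~6x the reading coefficient.
score = 0.30 * doing + 0.05 * reading + 0.03 * video + rng.normal(0, 0.1, n)

model = LinearRegression().fit(np.column_stack([doing, reading, video]), score)
b_do, b_read, b_video = model.coef_
print(f"doing/reading coefficient ratio: {b_do / b_read:.1f}")
```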

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

An adequate emotional state has been shown to be essential for students' learning. This paper explores the possibility of obtaining students' feedback about the emotions they feel in class in order to discover potential emotion patterns that might indicate learning failures. The paper presents a visual dashboard that allows students to track their emotions and follow their evolution during the course. We compiled the principal classroom-related emotions and developed a two-phase inquiry process to: verify that students' emotions can be measured in the classroom; discover how emotions can be displayed to promote self-reflection; and confirm the impact of emotions on learning performance. Our results suggest that students' emotions in class are related to evaluation marks. This shows that early information about students' emotions can be useful for teachers and students in improving classroom results and learning outcomes.

Citation: Samara Ruiz, Sven Charleer, Maite Urretavizcaya, Joris Klerkx, Isabel Fernandez-Castro and Erik Duval (2016). "Supporting learning by considering emotions: Tracking and Visualization. A case study". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
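
A small sketch of the kind of emotion-marks association the results suggest, as a simple correlation. The self-report scale and the data are illustrative, and the synthetic data are generated with a built-in positive relationship.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(5)
# Hypothetical per-student averages of in-class emotion self-reports (1-5 scale).
positive_emotion = rng.uniform(1, 5, 40)
# Synthetic evaluation marks constructed to co-vary with reported emotion.
marks = 2.0 + 1.2 * positive_emotion + rng.normal(0, 1.5, 40)

r, p = pearsonr(positive_emotion, marks)
print(f"r = {r:.2f}, p = {p:.3f}")
```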

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

This paper explores the potential of analytics for improving accessibility of e-learning systems and for supporting disabled learners in their studies.

The work is set in the context of learning and academic analytics' focus on issues of retention. The definitions of disability and accessibility in e-learning are outlined, and the implications of these for how disabled students' needs may be modeled in learning analytics systems are briefly discussed.

A comparative analysis of completion rates between disabled and non-disabled students in a large data set covering 5 years of Open University modules is presented. The wide variation in comparative retention rates is noted and characterized. A key assertion of this paper is that learning analytics provides a set of tools for identifying and understanding such discrepancies, and that analytics can be used to focus interventions that will improve the retention of disabled students in particular. A comparison of this quantitative approach with that of qualitative end-of-module surveys is made. The paper also describes how an approach called Critical Learning Paths, currently being researched, may be used to identify accessibility deficits in module components that significantly impact the learning of disabled students.

An outline agenda for onward research currently being planned is given.

It is hoped that this paper will stimulate wider interest in the potential benefits of learning analytics for higher education institutions as they try to assure the accessibility of their e-learning and to provide support for disabled students.

Citation: Martyn Cooper, Rebecca Ferguson and Annika Wolff (2016). "What Can Analytics Contribute to Accessibility in e-Learning Systems and to Disabled Students' Learning?". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
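
The comparative completion-rate analysis could, for example, be run as a two-sample proportions test per module. The counts below are invented, and the paper's exact statistical method is not specified in the abstract.

```python
from statsmodels.stats.proportion import proportions_ztest

# Illustrative counts for one module; the real study covered 5 years of OU modules.
completed = [310, 1_450]   # [disabled, non-disabled] completers
enrolled = [480, 1_900]    # corresponding enrolments

stat, p = proportions_ztest(completed, enrolled)
rates = [c / n for c, n in zip(completed, enrolled)]
print(f"completion: disabled {rates[0]:.1%} vs non-disabled {rates[1]:.1%}, p = {p:.4f}")
```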

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Data-driven methods have previously been used in intelligent tutoring systems to improve student learning outcomes and predict student learning methods. We have been incorporating data-driven methods for feedback and problem selection into Deep Thought, a logic tutor in which students practice constructing deductive logic proofs. These methods include data-driven hints and a data-driven mastery learning system (DDML), which calculates student proficiency from rule scores weighted according to expert input in order to assign problem sets of appropriate difficulty.
In this latest study we have implemented our data-driven proficiency profiler (DDPP) in Deep Thought as a proof of concept. The DDPP determines student proficiency without expert involvement by comparing relevant student rule scores to those of previous students who behaved similarly in the tutor and successfully completed it. The results show that the DDPP improved in performance with additional data and proved to be an effective proof of concept.

Citation: Behrooz Mostafavi and Tiffany Barnes (2016). "Data-driven Proficiency Profiling - Proof of Concept". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.
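
A hedged sketch of the similarity idea behind the DDPP: estimate a student's proficiency from the rule-score profiles of previous, successful students who behaved similarly. The nearest-neighbour formulation, score ranges, and rule count are assumptions, not the published algorithm.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(6)
# Rows: prior students who successfully completed the tutor;
# columns: per-rule proficiency scores in [0, 1].
prior_scores = rng.uniform(0, 1, size=(200, 10))
current = rng.uniform(0, 1, size=(1, 10))   # the active student's rule scores

nn = NearestNeighbors(n_neighbors=5).fit(prior_scores)
dist, idx = nn.kneighbors(current)
# Estimate proficiency as the mean rule profile of the most similar completers.
estimate = prior_scores[idx[0]].mean(axis=0)
print("estimated per-rule proficiency:", estimate.round(2))
```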