Archives

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector: | Country:

The paper defines privacy as the regulation of how personal digital information is observed by the self or distributed to other observers. It defines ethics as the systematization of correct and incorrect behaviour in virtual spaces according to all stakeholders.

Principles identified within this paper are:

  • transparency
  • student control over data
  • rights of access
  • accountability and assessment

The authors argue that, by discussing the various aspects within each of these categories, institutions have mechanisms to assess their initiatives and achieve compliance with current laws and regulations as well as with socially derived requirements.

In terms of the Evidence Hub proposition 'Learning analytics are used in an ethical way', the need for a paper such as this to be written implies that appropriate measures are being developed but are not yet in place. As the authors note, 'The ethical and privacy issues derived from these scenarios are not properly addressed.'

Summary provided within the paper:

What is already known about this topic
• Learning analytics offers the possibility of collecting detailed information about how students learn.
• The ethical and privacy issues derived from these scenarios are not properly addressed.
• There is a need to clarify how these issues must be addressed from the early stages of the deployment of a learning analytics scenario.
What this paper adds
• An account of how the main legislations are advancing in the general area of privacy.
• A comparison of how other disciplines such as medicine have dealt with privacy issues when collecting private information.
• The description of a group of practical principles in which to include all the ethical and privacy-related issues present when deploying a learning analytics application.
Implications for practice and/or policy
• Designers may now take into account these principles to guide the implementation of learning analytics platforms.
• Students are now aware of the different categories of issues that must be addressed by applications collecting data while they learn.
• Instructors now have a more detailed account of the issues to address when adopting learning analytics techniques.

Citation: Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438-450. | Url: https://www.researchgate.net/profile/Abelardo_Pardo/publications?pubType=article

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

This study examined the extent to which instructional conditions influence the prediction of academic success in nine undergraduate courses offered in a blended learning model (n=4134).

The results of the study indicate that using generalized models to predict academic success threatens the potential of learning analytics to improve the quality of teaching and learning practice. This is consistent with literature on learning theory, which stresses the importance of contextual factors when considering academic risk.

The authors suggest that the under-explored role of contextual variables may help explain the mixed findings in the learning analytics field. To date, even large-scale studies have reported differences in the overall predictive power of the same variables associated with activity in learning management systems (LMS).

The authors conclude that it is crucial for learning analytics research to take into account the diverse ways in which technology is adopted and applied in course-specific contexts. A lack of attention to instructional conditions can result in over- or under-estimation of the effects of LMS features on academic success.
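As a rough illustration of this point (a minimal sketch, not the authors' analysis; the synthetic data, variable names and model choice are assumptions made here for illustration), a single pooled model of LMS activity can be contrasted with models fitted per course:

    # Minimal sketch: one pooled ("generalized") model versus course-specific models
    # for predicting success from LMS activity. Synthetic data; illustrative only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(42)

    def simulate_course(n, login_effect):
        """One course: logins and forum posts whose relationship to success depends
        on a course-specific instructional condition (login_effect)."""
        X = rng.poisson(lam=(20, 5), size=(n, 2)).astype(float)
        logits = login_effect * 0.1 * X[:, 0] + 0.2 * X[:, 1] - 2.0
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)
        return X, y

    courses = {name: simulate_course(400, effect)
               for name, effect in [("course_A", 1.0), ("course_B", 0.2), ("course_C", -0.5)]}

    # Pooled "one size fits all" model trained across all courses at once.
    X_all = np.vstack([X for X, _ in courses.values()])
    y_all = np.concatenate([y for _, y in courses.values()])
    pooled = LogisticRegression().fit(X_all, y_all)

    # Compare in-sample discrimination of the pooled model with per-course models.
    for name, (X, y) in courses.items():
        specific = LogisticRegression().fit(X, y)
        print(name,
              "pooled AUC:", round(roc_auc_score(y, pooled.predict_proba(X)[:, 1]), 2),
              "per-course AUC:", round(roc_auc_score(y, specific.predict_proba(X)[:, 1]), 2))

Under these assumptions the pooled model fits the 'average' course and discriminates less well in courses whose instructional conditions differ from that average, which is the kind of over- and under-estimation described above.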

The authors suggest designing learning analytics studies within clear theoretical frameworks that will a) connect learning analytics research with decades of previous research in education and b) make clear what is contended by the research design, and so make explicit what the research outcomes mean in relation to existing models and previous findings.

In relation to the propositions of this Evidence Hub, the paper notes that, “Learning analytics will not be of practical value or widely adopted if it cannot offer insights that are useful for both learners and teachers”.

Citation: Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84. | Url: https://www.researchgate.net/profile/Dragan_Gasevic/publications?pubType=article

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector: | Country:

The UK's Department for Education (DfE) has been using big data to suggest that becoming an academy (a type of school currently favoured by the government) improves pupils' results on the national standardised tests in English and Maths (SATs).

However, the UK Statistics Authority (UKSA) has cast doubt on this analysis. Differences in gains may not be connected with academy status; instead, they appear to be part of a general trend for schools in England and Wales. Overall, schools with poor SATs results in 2012-13 have closed the gap on those that previously had better results.

Non-academies that started with the same – generally poor – test results as sponsored academies in 2013 actually registered faster improvements in 2014.

The UKSA said that the DfE should in future state that the data as presented could not be used – by politicians or by others – to imply a causal link between academy status and improvements in test results.

Citation: The Guardian, 14 July 2015, 'Department for Education rapped over use of Sats data' | Url: http://www.theguardian.com/education/2015/jul/14/department-for-education-sats-data-academies-uksa-performance-data

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

There is no proven direct effect between the availability of this form of learning analytics and learning outcomes.

When students work together in a computer-assisted learning environment, it is difficult to get a good picture of the progress of the groups. Learning analytics, in which data on students' learning activities are displayed graphically, helps teachers to guide the groups better and makes students aware of the cooperation process. The expectation is that insight into the course of the cooperation process not only contributes to better cooperation, but also leads to better learning outcomes.

It is a familiar problem: in a group assignment, one of the group members does not take the task seriously, so part of the assignment stagnates and/or more work ends up on the shoulders of others. As a teacher you often hear about this only afterwards, when the assignment has already been submitted. In group assignments in a computer-assisted learning environment, these kinds of obstacles should come to light at an earlier stage, because the teacher can watch the cooperation process thanks to the information the system gathers on students' activities. In practice, however, the amount of information collected is so large that teachers cannot see the wood for the trees. Visualizing the relevant data makes the information much more manageable: students see how much each of them contributes, and the teacher gains insight into the process. This graphical representation of student activities is a form of learning analytics.
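The general idea can be sketched as follows (a minimal, made-up example, not the Participation Tool itself; the log format and names are invented for illustration): raw activity logs are aggregated into a per-member contribution overview of the kind such dashboards display graphically.

    # Minimal sketch: summarising raw activity logs per group member so that
    # imbalances in participation stand out. The log format is a made-up example.
    from collections import Counter

    # Hypothetical activity log entries: (group, member, action)
    log = [
        ("group 1", "Anna", "post"), ("group 1", "Anna", "post"),
        ("group 1", "Ben", "post"),  ("group 1", "Cem", "edit"),
        ("group 2", "Dana", "post"), ("group 2", "Eli", "post"),
        ("group 2", "Eli", "edit"),  ("group 2", "Eli", "post"),
    ]

    contributions = Counter((group, member) for group, member, _ in log)

    for (group, member), count in sorted(contributions.items()):
        # A simple text "bar" stands in for the graphical display described above.
        print(f"{group:8s} {member:6s} {'#' * count} ({count})")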

Effect of learning analytics on guidance by the teacher

Teachers can also get feedback from the system on the activities of the students, in the form of visualizations. Can they better assist students with this information (arrow 3 -> 4 in Figure 1)? Our research on the Participation Tool (Van Leeuwen et al.) shows that, thanks to information about the individual contributions of group members, teachers can better detect imbalances in the groups and act on them. They seem to send relatively more messages to groups where participation is problematic than to groups where this is not the case. In addition, their assessment of the participation is more specific, because they can accurately identify which group member does not participate proportionally.

In a study completed at the time of writing, teachers were given information about which task-relevant keywords students used in their discussions (Draft Trail Tool) and the extent to which they made progress on the task. The result was that teachers in general sent more messages, again most often to groups with task-related problems, for instance when there were discussions about things other than the subject of the task.

The presence of learning analytics thus not only seems to help teachers keep an eye on problems, but also prompts them to provide help to groups that are experiencing them.

Effect of learning analytics on learning outcomes

In summary, we can conclude that learning analytics has beneficial effects on the course of collaboration. Insight into participation makes students' attitudes more active and improves the guidance given by teachers. It was therefore expected that this would also lead to better learning outcomes (arrow 5 in Figure 1). Is that the case? No: to date this expectation is not supported by research; there is no proven direct effect between the availability of this form of learning analytics and learning outcomes. Students gain a better understanding of the participation of their group members, and although teachers can guide the groups better on the basis of learning analytics, this does not result in better group outcomes or better individual performance. The added value of learning analytics is therefore demonstrable for the collaboration process, but not for learning outcomes.

Citation: Leeuwen, A. van, Janssen, J., Erkens, G. (2014) Effecten van learning analytics bij computer ondersteund samenwerkend leren, Kennisnet, 4W Issue 4-2014 | Url: http://4w.kennisnet.nl/artikelen/2014/12/10/effecten-van-learning-analytics-bij-computer-onder/

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

Causes of stagnation, or conversely of growth, in pupils' development are not researched in depth, with the result that no appropriate plans can be developed or implemented.

Introduction

In general, primary schools have one or more digital systems with which they can record and monitor students' learning performance. The Ministry of Education, Culture and Science attributes an important function to these student management systems: conducting tests and analyzing learning performance should be an important tool for improving education and learning outcomes (Ministry of Education, Culture and Science, 2007). Research by the Inspectorate of Education shows that schools that analyze their learning performance, and then adjust their teaching based on the results of these analyses, indeed achieve higher output (Inspectorate of Education, 2010). The international literature also supports the positive influence of using and analyzing student performance data on students' learning outcomes (Feldman & Tung, 2001; Johnson, 2002).

To increase output, it is important that schools and school boards use these student management systems and correctly interpret the outcomes of the analyses; only then can the results be used to improve education and output.

Various researchers, however, show that schools do not make optimal use of the possibilities of these digital student management systems (Meijer, Ledoux & Elshof, 2011; Visscher, Peters & Staman, 2010). The purpose of this report is therefore to provide information about the possibilities of digital student management systems and about how their use can lead to higher learning outcomes. Teachers, school boards and educational administrators can use the contents of this publication to assess and improve their own use of a student management system (for this purpose, the annexes include an instrument with which schools can determine their level of student management system use).

This report (in Dutch) also discusses and explains the following points:

  • why and how a student management system within a result-oriented approach can lead to better learning outcomes,
  • the specific support that digital student management systems can provide within result-oriented working (Hd. 2),
  • the current use of student management systems by schools and school boards, and the conditions for that use (Hd. 3),
  • the various stages of use of student management systems (Hd. 4),
  • and finally, three inspiring examples of student management systems (Hd. 5).

In compiling this report, use has been made of the knowledge and experience gained in the Focus project of the University of Twente. In this project, 150 primary schools are being trained in result-oriented working, in which the use of student management systems plays an important role.

Conclusion

This report describes what result-oriented working entails and how student monitoring systems can support it. Schools and school boards work in a result-oriented way when they work systematically and purposefully with the aim of maximizing student achievement (Inspectorate of Education, 2010).

Teachers, schools and school boards need feedback on their actions and performance as a basis for monitoring and improving their education. The LVS (student management system) provides schools and school boards with important feedback. With the LVS and the associated Cito LVS tests, they can receive feedback on the performance and development of individual students, groups of students and schools. To examine in depth the causes of stagnation, or conversely of growth, in students' development, other information must be obtained with other instruments. This can be done, among other things, by analyzing method-dependent tests, through diagnostic interviews with students, and with the aid of classroom observations.

However, research shows that schools struggle with this last point: causes of stagnation, or conversely of growth, in pupils' development are not researched in depth, with the result that no appropriate plans can be developed or implemented.

If this is the case, then the LVS is only used as a measuring and viewing instrument, and not as a tool to improve educational practices. Often, a first awareness is needed to see that assessment data (also) say something about the quality of education, and that these data represent an important starting point for customizing and improving education.

In the report, six stages of LVS use have been formulated. These stages emerge from ongoing experiences within the Focus project; the steps are not supported empirically. The three practical examples indicate, however, that the structure of the pyramid can be recognized within schools, in particular the distinction between the measurement-and-checking phase and the phase in which use leads to changes in teaching practice. In addition, these practical examples show that there is a tension between stage five and stage six. In stage five, school administrators use the LVS, among other things, as an instrument with which they keep an eye on teachers' performance; if this is not done properly it can lead to a blame culture.

When this is the case, the sixth stage of self-reflection, in which teachers on their own initiative systematically use performance feedback for their professional development, will not be reached. Furthermore, the use of the LVS within a result-oriented approach can then produce undesirable side effects: the focus on learning outcomes can increase pressure on students, teachers, schools and school boards, which leads to teaching to the test. Even fraud with assessment data is not inconceivable. Teachers and administrators must therefore be alert to the occurrence of such negative effects. It is thus important to create a school culture in which teachers can openly and safely discuss the results of their work, not as a basis for controlling each other's actions, but to improve teaching quality on the basis of good feedback.

If we know better what (different) students need for their development and can respond better to those needs, students will also learn more, which is ultimately what matters in result-oriented working.

Citation: Faber, M., Geel, M. van, Visscher, A. (2013) Digitale Leerlingvolgsystemen als basis voor Opbrengstgericht werken in het Primair Onderwijs, Universiteit Twente | Url: http://www.kennisnet.nl/uploads/tx_kncontentelements/Kennisnet_onderzoeksanalyse_LVS-gebruik_voor_OGW.pdf

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector: | Country:

Institutional policy frameworks should not only provide an enabling environment for the optimal and ethical harvesting and use of data, but also clarify who benefits and under what conditions, establish conditions for consent and the de-identification of data, and address issues of vulnerability and harm. A directed content analysis of the policy frameworks of two large distance education institutions shows that current policy frameworks do not facilitate the provision of an enabling environment for learning analytics to fulfil its promise.

Citation: Prinsloo, P., & Slade, S. (2013, 8-12 April). An evaluation of policy frameworks for addressing ethical considerations in learning analytics. Paper presented at the Third International Learning Analytics & Knowledge Conference (LAK13), Leuven, Belgium.

Type: Evidence | Proposition: C: Uptake | Polarity: | Sector: | Country:

The authors were invited to undertake a current state analysis of enterprise learning management system (LMS) use in a large research-intensive university, to provide data to inform and guide an LMS review and strategic planning process. Using a new e-learning analytics platform, combined with data visualization and participant observation, they prepared a detailed snapshot of current LMS use patterns and trends and their relationship to student learning outcomes. This paper discusses the reality that the institutional planning process was nonetheless dominated by technical concerns, and made little use of the intelligence revealed by the analytics process. To explain this phenomenon, the authors consider theories of change management and resistance to innovation, and argue that to have meaningful impact, learning analytics proponents must also delve into the socio-technical sphere to ensure that learning analytics data are presented to those involved in strategic institutional planning in ways that have the power to motivate organizational adoption and cultural change.

Citation: Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough: Why e-learning analytics failed to inform an institutional strategic plan. Educational Technology & Society, 15(3), 149-163.

Type: Evidence | Proposition: D: Ethics | Polarity: | Sector: | Country:

Protests began in earnest when it was discovered that inBloom's software had more than 400 optional data fields that schools could fill out, asking for potentially sensitive information such as the nature of family relationships, learning disabilities, and even Social Security numbers. Although there were no reported leaks, parents were uncomfortable without an absolute guarantee of that data's safety or a clear indication of who could access it. (Slate Future Tense, 24 April 2014)

Type: Evidence | Proposition: A: Learning | Polarity: | Sector: | Country:

Abstract: In this paper, we explain the design research process that we used to develop the learning analytics for a fractions intervention program called HALF. In particular, we highlight a set of qualitative interviews that we conducted with individual students after a short study in which students in three classes at the same school learned to use the virtual manipulatives embedded in the technology-enhanced learning environment (TELE) to compare pairs of proper fractions and order groups of 3 proper fractions. During the intervention, the analytics we collected for each of these 7 students indicated that they failed to master the content taught during the intervention, but they provided little insight into why and how the students were struggling. In contrast, the qualitative interviews provided us with considerable information that helped us diagnose and address misconceptions and incomplete understandings. These insights led to design changes for the lessons students who use the most up-to-date version of HALF experience via the TELE, which in turn improved the learning analytics for our system.

Overview from the Hub: This paper by Mendiburo et al. discusses the value that qualitative research (in this case student interviews) can offer in improving the interpretation of data analytics and the effective use of these analytics in an intervention programme teaching fractions. Drawing on 7 student interviews undertaken as part of their study, they highlight that learning analytics data can give a misleading impression about what, and how much, has been learnt. They explain, ‘the quantitative data that we collected about these students …yielded mostly inconclusive or misleading results while the qualitative data we collected provide particularly informative insights that led to design changes in future versions of the system.’ The case studies provided reveal how fragile the relationship can be between a learning analytic (data) and the correct interpretation of that data. In their report, the authors conclude, “once we better understood these students' process, we were able to redesign the interaction students had with the TELE in ways that provided more useful quantitative data. Because the meaning of interaction traces is defined by the designs of the interactions, refinement of analytics and user interaction with a TELE go hand in hand.”


Citation: Mendiburo, M., Sulcer, B., & Hasselbring, T. (2014). Interaction design for improved analytics. The 4th International Conference on Learning Analytics and Knowledge, March 24-28, 2014, Indianapolis, USA. | Url: http://lak14indy.wordpress.com/lak-2014-conference-program-schedule/