Proposition A – Learning: Learning analytics improve learning outcomes.
Learning is at the heart of learning analytics. Do we see real improvements in learning outcomes for learners? We might be able to see patterns in learners’ data, but can we take action based on those patterns that improves their learning? We might be able to personalise learning based on learners’ data, but does that make any difference to how much they learn?
This proposition is about improved learning outcomes, including cognitive gains, improved assessment marks and test scores, and better attainment results.
Example positive evidence: A study showing measurable improvements in test scores among learners who received study prompts from a learning analytics system, compared with learners taught using the usual approach.
Example negative evidence: A study showing no significant difference in assessment results before and after the introduction of a learning analytics dashboard.
[Classification note: Evidence about making learning more efficient, or about improving retention, completion and progression, should be related to Proposition B – Teaching.]
Proposition B – Teaching: Learning analytics improve learning support and teaching, including retention, completion and progression.
Do learning analytics optimise the learning process? We would expect them to lead to more efficient processes, allow resources to be better targeted, and save money and time. We would also expect performance metrics (other than attainment) to improve. Do learning analytics lead to improvements in retention, completion and progression?
This proposition is about improvements to teaching and learning that are not direct learning gains by the learner.
Example positive evidence: A study showing improved retention among student cohorts whose tutors were prompted to contact at-risk learners identified by a predictive model compared to other cohorts.
Example negative evidence: A learning analytics project that increased costs but resulted in no improvements in efficiency or performance.
[Classification note: Evidence about learning gains, or improved assessment scores, should be related to Proposition A – Learning.]
Proposition C: Learning analytics are taken up and used widely, including deployment at scale.
Are learning analytics a fad that will never be deployed at real scale? Will they ever move beyond pilot projects and demos? If a system is deployed across a whole organisation, do the teachers and learners actually use it?
This proposition focuses on the level of usage of learning analytics, and is concerned with institutional and policy perspectives.
Example positive evidence: A survey of learning analytics dashboard usage across the university sector in one country finds that dashboards are used in more than half of institutions.
Example negative evidence: A predictive modelling project is rolled out across a group of schools, but usage is low and the project is discontinued.
Proposition D: Learning analytics are used in an ethical way.
Learning analytics raise many ethical issues, including those related to privacy, transparency, surveillance, data ownership and control, and data protection. Can these real concerns be addressed effectively, or will they prove to be barriers?
This proposition is about the ‘should we’ questions, rather than the ‘can we’ ones addressed by the other propositions.
Example positive evidence: An organisation develops an ethics policy for learning analytics that is warmly received by learners and other stakeholders.
Example negative evidence: A large-scale project to gather analytics data across multiple schools is shut down because of concerns about privacy.