Using A/B Testing in MOOC Environments

Type: Evidence | Proposition: B: Teaching | Polarity: | Sector: | Country:

In recent years, Massive Open Online Courses (MOOCs) have become a phenomenon, offering the possibility to teach thousands of participants simultaneously. At the same time, the platforms used to deliver these courses are still in their fledgling stages. While course content and didactics are the primary factors in the success of massive courses, a smart platform may still improve or degrade the learner's experience and learning outcomes. This paper proposes an A/B testing framework that can be used within a microservice architecture to validate hypotheses about how learners use the platform and to enable data-driven decisions about new features and settings. To evaluate this framework, three new features (an Onboarding Tour, Reminder Mails, and a Pinboard Digest) were identified based on a user survey. They were implemented and introduced on two large MOOC platforms, and their influence on learner behavior was measured. Finally, the paper proposes a data-driven decision workflow for the introduction of new features and settings on e-learning platforms.
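A core building block of any such A/B testing framework is assigning each learner to a variant consistently, so that the same user always sees the same version of a feature (e.g. the Onboarding Tour on or off). The paper does not publish its implementation; the sketch below shows one common approach, deterministic hash-based bucketing, with hypothetical function and experiment names chosen for illustration:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants: tuple = ("A", "B")) -> str:
    """Deterministically map a user to an experiment variant.

    Hashing the (experiment, user) pair means assignment is stable
    across sessions and services without shared state -- convenient
    in a microservice architecture.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: bucket a learner for a hypothetical "onboarding_tour" experiment.
variant = assign_variant("learner-42", "onboarding_tour")
```

Because the bucket is derived from a hash rather than stored, any service on the platform can compute the same assignment independently, which fits the microservice setting the paper describes.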

Citation: Jan Renz, Daniel Hoffmann, Thomas Staubitz and Christoph Meinel (2016). "Using A/B Testing in MOOC Environments". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.