Computerized educational systems are increasingly provided as open online services that offer adaptive, personalized learning experiences. To fully exploit the potential of such systems, it is necessary to thoroughly evaluate different design choices. However, both openness and adaptivity make proper evaluation difficult. We provide a detailed report on the evaluation of an online system for adaptive practice of geography, and we use this case study to highlight methodological issues in the evaluation of open online learning systems, particularly attrition bias. To facilitate the evaluation of learning, we propose the use of randomized reference questions. We illustrate the application of survival analysis and of learning curves for declarative knowledge. The results provide interesting insight into the impact of adaptivity on learner behaviour and learning.
Jan Papoušek, Vít Stanislav, and Radek Pelánek (2016). "Evaluation of an Adaptive Practice System for Learning Geography Facts". In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York.