Knowledge acquisition for learning analytics: Comparing teacher-derived, algorithm-derived, and hybrid models in the Moodle Engagement Analytics Plugin

Publisher:
Springer Nature
Publication Type:
Conference Proceeding
Citation:
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2016, 9806 LNCS, pp. 183-197
Issue Date:
2016-01-01
Filename:
978-3-319-42706-5_14.pdf (Published version, Adobe PDF, 808.02 kB)
Abstract:
One of the promises of big data in higher education (learning analytics) is being able to accurately identify and assist students who may not be engaging as expected. These expectations, distilled into parameters for learning analytics tools, can be determined by human teacher experts or by algorithms themselves. However, little work has been done to compare the power of knowledge models acquired from teachers with that of models acquired from algorithms. In the context of an open source learning analytics tool, the Moodle Engagement Analytics Plugin, we examined the ability of teacher-derived models to accurately predict student engagement and performance, compared to models derived from algorithms, as well as hybrid models. Our preliminary findings, reported here, provided evidence for the fallibility and strength of teacher- and algorithm-derived models, respectively, and highlighted the benefits of a hybrid approach to model- and knowledge-generation for learning analytics. A human-in-the-loop solution is therefore suggested as a possible optimal approach.
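To illustrate the kind of parameterised model the abstract refers to, the sketch below shows a minimal weighted-indicator risk score in which the weights can be supplied by a teacher, estimated from historical outcome data, or blended into a hybrid. It is a sketch under stated assumptions, not the plugin's actual implementation: the indicator names, function names, and numbers are hypothetical and not taken from the Moodle Engagement Analytics Plugin code.

```python
# Hypothetical sketch of a weighted-indicator engagement model, loosely
# inspired in style by the Moodle Engagement Analytics Plugin; all names
# and numbers below are illustrative assumptions, not the plugin's API.
import numpy as np

INDICATORS = ["login", "forum", "assessment"]

def risk_score(indicator_values, weights):
    """Weighted sum of per-indicator risk values, each assumed to lie in [0, 1]."""
    return sum(weights[k] * indicator_values[k] for k in INDICATORS)

# Teacher-derived model: weights set directly from expert judgement.
teacher_weights = {"login": 0.5, "forum": 0.2, "assessment": 0.3}

def algorithm_weights(history, outcomes):
    """Algorithm-derived model: fit weights to past data by least squares.

    history: array of shape (n_students, n_indicators) of indicator risk values.
    outcomes: array of shape (n_students,), 1 if the student later disengaged/failed.
    """
    w, *_ = np.linalg.lstsq(history, outcomes, rcond=None)
    w = np.clip(w, 0, None)                      # keep weights non-negative
    total = w.sum()
    w = w / total if total > 0 else np.full(len(INDICATORS), 1 / len(INDICATORS))
    return dict(zip(INDICATORS, w))

def hybrid_weights(teacher, algorithm, alpha=0.5):
    """Hybrid model: blend the two sources; alpha is the teacher's share."""
    return {k: alpha * teacher[k] + (1 - alpha) * algorithm[k] for k in INDICATORS}
```

In use, one would compute risk_score for each student under the teacher, algorithm, and hybrid weightings and compare the resulting predictions against observed engagement and grades, which is broadly the style of comparison the study reports.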