Innovating Pedagogy – Learning Design Informed by Analytics
This is the third in a series of annual reports on innovations in teaching, learning and assessment. The Innovating Pedagogy reports from The Open University (UK) are intended for teachers, policy makers, academics and anyone interested in how education may change over the next ten years.
Learning design informed by analytics – A productive cycle linking design and analysis of effective learning
Potential impact: high
Timescale: medium (2–5 years)
As learning is taken online, there are opportunities to collect data on student activities and analyse these, both to inform the design of new courses and to improve the learning experience. The data can also be linked with test results to show which learning activities produce good results and to identify where learners are struggling.
These kinds of user behaviour data can be supplemented with background information from student admission systems, such as prior education, course registrations, or the number of credits obtained in a year. Easily obtained but low-quality data, such as log files of test scores and details of which materials students have viewed, can be combined with information gathered directly through surveys of learners' goals and motivation, to create a rich picture of the patterns and pitfalls of taking an online course.
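As a minimal sketch of this kind of combination, behavioural log records and survey responses can be joined per student. All field names and values here are hypothetical illustrations, not drawn from any real system:

```python
# Sketch: joining easily captured log data with survey responses.
# Student IDs, field names and values are invented for illustration.

logs = {  # from the virtual learning environment
    "s001": {"pages_viewed": 42, "quiz_score": 78},
    "s002": {"pages_viewed": 12, "quiz_score": 55},
}

surveys = {  # gathered directly from learners
    "s001": {"goal": "career change", "motivation": "high"},
    "s002": {"goal": "curiosity", "motivation": "low"},
}

def build_profiles(logs, surveys):
    """Merge behavioural logs with survey data, one profile per student."""
    profiles = {}
    for student_id, activity in logs.items():
        profile = dict(activity)
        # Survey data may be missing for some students; leave those fields out.
        profile.update(surveys.get(student_id, {}))
        profiles[student_id] = profile
    return profiles

profiles = build_profiles(logs, surveys)
print(profiles["s001"])
```

The merged profile gives course teams both the easily logged behaviour and the self-reported context in one record, which is the "rich picture" the report describes.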
What to measure?
There is, however, no consensus in the learning analytics community on which learner activities and survey data are appropriate to measure, nor how these can be interpreted to improve teaching and raise performance. For example, a study of 118 biology students found that the number of discussion messages posted, assessments finished, and mail messages sent were useful in predicting student performance, but other measures, such as overall time spent online, were not. This study and others using similar methods of analysis show that, in general, students who take the opportunity for discussion with their peers, are active in engaging with course materials, and keep up with the administrative details of the course, gain higher overall grades.
While these data may indicate connections between activity and performance, they do not explain why some types of learning are successful. Nor can they predict the performance of each student by studying general patterns of activity. A recent study of students on a mathematics course at Maastricht University logged over 100 variables (such as clicks, time spent, downloads, motivation, emotions, quizzes). It found that these basic data were only poor predictors of academic performance. Instead, richer information on performance on various assessments during the module, learning strategies, and attitudes to learning gave the researchers a better understanding of why and how students interacted in this environment over time.
It is tempting for course developers to track data that is relatively easy to capture, in order to gain a broad picture of performance. However, there is a need for researchers and practitioners to find 'actionable data' that can guide teachers in deciding when to offer assistance and can help course designers improve the content and structure of their courses. For example, an online course offers the opportunity for students to seek help and offer advice to their peers, but this needs to be supported by ways to judge when people are offering helpful and reliable advice. Analytics can indicate which social methods (such as rating the quality of advice or giving badges to helpful students) are most effective in helping students to find and give advice, and so guide the development of new tools and services. The Pittsburgh DataShop provides tools and datasets from existing online courses to help with identification of important learning activities and learning behaviours.
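One minimal form of 'actionable data' is a rule that turns raw activity counts into a reason for a teacher to check in with a student. The sketch below assumes hypothetical field names and thresholds; a real system would tune both against evidence from the course rather than fixing them by hand:

```python
# Sketch: turning raw logs into actionable alerts for teachers.
# Field names and thresholds are hypothetical illustrations.

def flag_for_assistance(student, min_posts=3, min_logins=5):
    """Return the reasons (if any) a teacher might offer this student help."""
    reasons = []
    if student["discussion_posts"] < min_posts:
        reasons.append("low participation in discussion")
    if student["logins_this_week"] < min_logins:
        reasons.append("infrequent logins")
    return reasons

cohort = [
    {"id": "s001", "discussion_posts": 8, "logins_this_week": 6},
    {"id": "s002", "discussion_posts": 1, "logins_this_week": 2},
]

# Keep only students with at least one reason for concern.
alerts = {}
for student in cohort:
    reasons = flag_for_assistance(student)
    if reasons:
        alerts[student["id"]] = reasons

print(alerts)
```

The point of such a rule is not the thresholds themselves but that the output names a concrete, interpretable reason for intervening, which is what distinguishes actionable data from data that is merely easy to capture.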
To make sense of the vast amounts of data that can be linked to understand learners’ journeys, Miller and Mork propose a value chain for discovery, integration, and exploitation of large-scale data. This could help organisations to align their learning design with the outcomes from learning analytics in order to improve students’ learning experience.
There are limits to the kinds of learning behaviour that institutions can analyse. As students learn at home or on the move with their own devices and tools, their activities are increasingly beyond the reach of data loggers. Longitudinal research on a Dutch medical programme using problem-based learning showed that 80% of the students learned more from contacts outside their formal group than from their fellow course members. The informal social networks and personal tools of students have a substantial impact on their attitudes, actions and behaviour.
A potential danger of learning analytics is mislabelling students according to incomplete or incorrect information, or inaccurate algorithms. How learners behave in classrooms or online depends upon a complex interaction of personal, emotional, social, and economic factors that are not directly observable from behaviour alone. In their extreme form, learning analytics may undermine or restrict an individual's access to particular materials or resources. For example, when learning analytics are used to suggest courses and modules that increase a particular student's chance of successfully completing a qualification, there is a danger that students will select easier courses with a higher possibility of success rather than more appropriate, but challenging, courses. It is therefore important to involve students as active agents and collaborators. Student-oriented learning analytics, with data shown to the learners, could provide students and teachers with opportunities for self-reflection and the development of shared understanding.
Substantial progress has been made in using the power of learning analytics to inform and tune innovative learning designs. An important consideration for institutions wanting to implement learning analytics is their capacity to produce and act on reliable data. Organisational change takes substantial time, effort and financial resources. We expect an increased use of learning analytics by managers and teachers to improve the quality of their courses. This, in turn, will help the learning analytics community to understand more clearly which variables for learning are important, how to incorporate informal learning, and where the ethical boundaries of learning analytics lie.
Sharples, M., Adams, A., Ferguson, R., Gaved, M., McAndrew, P., Rienties, B., Weller, M., & Whitelock, D. (2014). Innovating Pedagogy 2014: Open University Innovation Report 3. Milton Keynes: The Open University.