Our Learning Analytics are Our Pedagogy, Are They? (#xAPI, #dalmooc)
In "Informing Pedagogical Action: Aligning Learning Analytics With Learning Design" (American Behavioral Scientist, 2013), Lockyer, Heathcote, and Dawson claim that learning design can be used "as a framework for design of analytics to support faculty in their learning and teaching decisions".
"But given the current nature of the tools available, just how realistic is this?" asked David Jones of the University of Southern Queensland in his post "Aligning learning analytics with learning design". That's a great question! A learning design might encompass a whole range of tools, so expecting an LMS to provide pre-installed analytics meets only part of the need at best. One reason is that learning doesn't happen only inside a box; another is that the probing, measurement, reporting and analytics offered by a tool may not fit the full range of pedagogies, and hence assessment needs. Because ... "Learning design establishes the objectives and pedagogical plans, which can then be evaluated against the outcomes captured through learning analytics, if the analytics are available".
The open access publication "Epistemology, Assessment, Pedagogy: Where Learning Meets Analytics in the Middle Space" (from the Open University, UK) examines the Epistemology–Assessment–Pedagogy triad:
We propose that the literature examining the triadic relationships between epistemology (the nature of knowledge), pedagogy (the nature of learning and teaching) and assessment provide critical considerations for bounding this middle space. We provide examples to illustrate the ways in which the understandings of particular analytics are informed by this triad. As a detailed worked example of how one might design analytics to scaffold a specific form of higher order learning, we focus on the construct of epistemic beliefs: beliefs about the nature of knowledge. We argue that analytics grounded in a pragmatic, sociocultural perspective are well placed to explore this construct using discourse-centric technologies.
Buckingham Shum (2012) used the shorthand "our learning analytics are our pedagogy": the types of analytic we choose to deploy, and the ways in which we deploy them, implicate particular approaches to learning and assessment. The relationship between learning analytics and pedagogy is important because both are bound up in epistemology: what knowledge is. The paper introduces the relationship between a number of established pedagogic approaches and learning analytics:
Transactional or instructionalist approach
Analytics Implications: learning analytics based on transactional approaches will tend to focus on simple metrics such as test scores, not requiring deeper analysis of more complex artefacts, or the processes by which they were derived.
Constructivist approach

Analytics Implications: learning analytics with a constructivist focus will focus on progress, particularly by tracking and judging the modifications made to a set of materials, resources or tools selected and arranged by the educator. An example of analytics in this tradition would be tracking the evolution of digital artefacts within the Scratch visual programming environment and community (Maloney et al., 2010).
Subjectivist or affect based approach
Analytics Implications: In tandem with other approaches, learning analytics based on subjectivist approaches are likely to provide motivation assessments for understanding why someone is (or is not) engaging in particular actions. Such analytics may focus on self-report through survey tools (Buckingham Shum and Deakin Crick, 2012) or affect-based semantic mark-up such as blog tagging (R. Ferguson, Buckingham Shum & Deakin Crick, 2011), alongside automated approaches such as textual sentiment analysis.
Apprenticeship approach

Analytics Implications: Analytics based on apprenticeship approaches are likely to focus on classifying expert and novice users, and the shift from novice to expert. Such analysis may explore behavioural markers which mirror those made by 'experts', but may not explore the reasons or meanings implicated in such moves. Epistemic Network Analysis of user data from gaming environments is designed to quantify the degree to which learners demonstrate behaviours valued in a professional community (Shaffer et al., 2009). The creation of social capital might be considered another proxy for community membership, overlapping with the next category.
Connectivist approach

Analytics Implications: Connectivist approaches use network analysis to explore the 'connectedness' of a learner's knowledge, in terms of both concepts and social connections. Analytics would look at how a network's size, quality and change over time can serve as proxies for effective learning (Dawson, 2010; Haythornthwaite and De Laat, 2010).
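To make the connectivist idea concrete, here is a minimal sketch, in plain Python, of tracking how a cohort's interaction network grows over time. The forum-reply events and learner names are entirely hypothetical; real connectivist analytics (e.g. Dawson, 2010) use richer SNA measures than size and density.

```python
from collections import defaultdict

def network_growth(interactions):
    """Given (week, learner_a, learner_b) interaction records, return
    per-week cumulative network size (nodes) and density, as rough
    proxies for a cohort's growing 'connectedness'."""
    nodes, edges = set(), set()
    by_week = defaultdict(list)
    for week, a, b in interactions:
        by_week[week].append((a, b))
    growth = {}
    for week in sorted(by_week):
        for a, b in by_week[week]:
            nodes.update((a, b))
            edges.add(frozenset((a, b)))  # undirected, no duplicates
        n = len(nodes)
        density = len(edges) / (n * (n - 1) / 2) if n > 1 else 0.0
        growth[week] = {"size": n, "density": round(density, 2)}
    return growth

# Hypothetical forum-reply events: (week, replier, replied-to)
events = [(1, "ana", "ben"), (1, "ben", "cho"),
          (2, "ana", "cho"), (2, "dee", "ana")]
print(network_growth(events))
```

A question-driven use of such a proxy would be: "is this learner's network broadening over time, or closing in on itself?"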
Pragmatic, sociocultural approach
Analytics Implications: Pragmatic approaches have traditionally focused less on assessing the products of learning (except where they are being used for something), and more on the process. Analytics tools in sociocultural approaches encourage learners to reflect on their own activity, in an attempt to understand how they can develop their skills in information processing, in their own particular contexts. Analytics within this approach might attend particularly to the quality of discourse for learning, for creating a mutuality of perspectives (Edwards & Mercer, 1987), including in collaborative information seeking tasks (Foster, 2009; Hertzum, 2008; Lazonder, 2005). Our previous work is in this tradition, drawing on sociocultural discourse analysis (Mercer & Littleton, 2007) and emerging conceptions of the pragmatic web (Buckingham Shum, 2006). This research foregrounds how students interact with information, make sense of it in their context and co-construct meaning in shared contexts. These are ongoing processes which highlight the question of how learning analytics fits into the context of AfL (Assessment for Learning) and pedagogy.
There is a strong relationship between analytics, assessment, pedagogy, and epistemology; learning analytics should be mindful of this triad, which sociocultural analytics bridges well.
Standardized assessments, such as psychometric tests, are good for population-level analytics. However, they are problematic for individual-level analysis, for understanding change, and for understanding an individual's traits and capabilities. The perspective we have outlined in this paper suggests analytics may play an important role in assessment for learning, providing formative, individualised feedback to learners and their teachers.
Learning design as a framework for design of analytics
A learning design can be used as a framework for the design of analytics to support faculty in their learning and teaching decisions. David pointed to learning design repositories (e.g. http://www.learningdesigns.uow.edu.au and http://126.96.36.199:42042/ODC.html), most of which are based on constructivist assumptions.
Essential elements in learning design:
- Key actors
- What they are expected to do
- What resources are used
- Sequence of activities
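One way to see how these elements could drive analytics design is to encode a learning design as data, with each step in the sequence carrying the analytics question it should answer. The design below is entirely hypothetical; it just shows the shape of the idea (actors, resources, sequence, and a question per activity):

```python
# A hypothetical learning design encoded as data. Each activity in the
# sequence names its key actors, its resources, and the analytics
# question that would evidence whether the design is working.
learning_design = {
    "title": "Week 3: Collaborative annotation",
    "sequence": [
        {"activity": "read_article", "actors": ["learner"],
         "resources": ["article.pdf"],
         "analytics_question": "Did learners open the article before class?"},
        {"activity": "annotate", "actors": ["learner", "peer"],
         "resources": ["annotation_tool"],
         "analytics_question": "Which passages attract the most discussion?"},
        {"activity": "debrief", "actors": ["educator"],
         "resources": ["forum"],
         "analytics_question": "Do summaries reference peers' annotations?"},
    ],
}

# The design itself tells us which questions the analytics must answer.
questions = [step["analytics_question"] for step in learning_design["sequence"]]
for q in questions:
    print("-", q)
```

Starting from the questions, rather than from whatever the LMS happens to log, is exactly the "questions first, data next" point made below.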
Back to the tool problem: educators who are held to account with analytics-based performance indicators set by learning systems may come to design learning in a way that enables them to evidence impact with those tools, just as "accounting tools ... do not simply aid the measurement of economic activity, they shape the reality they measure" (du Gay and Pryke, 2002). Aligning the learning analytics to the learning design (pedagogy), not the other way around, is the main point this article emphasizes here. This connects to a basic, simple argument we believe in: analytics should be question-driven, not data-driven (questions first, data next).
xAPI enables us to answer questions we could never answer before
Assessment for learning across learning systems and venues, with continuous feedback loops and immediate interventions, has become possible because of new ways to track learning records; among them, the new learning standard xAPI (Tin Can API) is a major breakthrough. xAPI enables us to design learning activities across different resources, tools and systems, and to track all the (granular) learning events that matter to the pedagogy and to learning itself. In short, xAPI is a data-plumbing technique with a standard learning language (yes, it's semantic). The language (the spec) isn't hard to understand, but its full potential might be hard to fulfil. It depends on the questions we ask; questions reveal problems, needs, insights and assumptions.
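To make that "standard learning language" concrete, here is a minimal sketch of a single xAPI statement built in Python, following the spec's actor-verb-object pattern. The verb URI is a standard ADL verb; the learner's name, email, and activity URI are illustrative, not from any real system:

```python
import json
import uuid
from datetime import datetime, timezone

# A minimal xAPI statement: "actor verb object", expressed as JSON
# property-value pairs. Learner and activity identifiers are made up.
statement = {
    "id": str(uuid.uuid4()),
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "actor": {
        "objectType": "Agent",
        "name": "Jane Learner",                 # illustrative
        "mbox": "mailto:jane@example.com",      # illustrative
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/courses/stats101/quiz-3",  # illustrative
        "definition": {"name": {"en-US": "Quiz 3"}},
    },
}

print(json.dumps(statement, indent=2))
```

In a real deployment this JSON would be POSTed to a Learning Record Store (LRS), which is what lets statements from many tools and systems land in one queryable stream.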
Although learning analytics (LA) isn't a new research field, its development has been constrained by the availability of data. For example, most analyses and models have been built on log data from inside learning systems (learning in a box). Now, by embedding xAPI calls in different software (written in different programming languages), JSON data in property-value pairs can be collected. Suddenly you can probe anything; what do you want to probe? xAPI enables us to answer questions we could never answer before, so now just ask questions! (About learners and about learning design; these two can't be considered separately.) Leveraging the techniques of xAPI and LA is what we are working on now. By the way, we are following #dalmooc; it's an inspiring experience that keeps bringing us back to thinking about learning itself and then asking relevant questions.
Note: just as an example, analytics of video watching is a popular research topic because of the popularity of MOOCs. Here are the questions we ask about watching videos. But keep in mind that watching videos is only a small part of a learning design plan.
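One such question is "how much of the video did the learner actually watch?". A minimal, question-driven sketch, assuming a time-ordered stream of play/pause events in the style of the xAPI video profile (the session data is hypothetical, and real sessions with seeking and re-watching need more care):

```python
def watched_seconds(events):
    """Rough watch-time from a time-ordered stream of (verb, video_time_sec)
    events, assuming every 'played' is eventually matched by a 'paused'
    or 'completed'. Verb names follow the xAPI video profile."""
    total, play_start = 0.0, None
    for verb, t in events:
        if verb == "played":
            play_start = t
        elif verb in ("paused", "completed") and play_start is not None:
            total += t - play_start
            play_start = None
    return total

# Hypothetical session: watched 0-90s, skipped ahead, watched 300-420s
session = [("played", 0), ("paused", 90), ("played", 300), ("completed", 420)]
print(watched_seconds(session))  # 90 + 120 = 210.0 seconds
```

Because xAPI events carry the video position, a question like "which segments do learners skip?" becomes answerable too, which pure LMS completion flags could never tell us.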
Knight, S., Buckingham Shum, S., & Littleton, K. (2014). Epistemology, assessment, pedagogy: Where learning meets analytics in the middle space. Journal of Learning Analytics (in press).