# xAPI Profile, Implementation and Reporting Design
In learning and teaching, almost everything works, but how well? John Hattie led a team at the University of Auckland, New Zealand, that compared the effect on learning of over 100 classroom interventions. In his book, Hattie concluded that almost everything we do in our efforts to help students in schools has a positive effect on students; however, much of what we do isn’t effective enough… (read more about the findings). A major reason is that there are no data-driven iterations for improvement and personalization.
Universal Design for Learning (UDL) aims to reach every student through differentiated learning designs. Individuals bring a huge variety of skills, needs, and interests to learning, and neuroscience reveals that these differences are as varied and unique as our DNA or fingerprints. This underscores the significance of instructional and learning strategies, since humans are so complex. Take the popular term “blended learning”: in this old post, six learning models that work are identified for blended learning.
When planning an xAPI implementation, the profile design, learning design and reporting design should be considered together.
## Our Analytics Should Be Our Pedagogy
A learning design SHOULD be used as a framework for the design of analytics that support faculty in their learning and teaching decisions. Read more discussion of this in Our Learning Analytics are Our Pedagogy, Are They? Through the lens of educators and learning designers, the question is what works (strategies, incentives, media types, group members…) for which learners and in which situations. The reporting design must help them find the learning/pedagogy patterns and the corresponding metrics. Different learning theories will also value different learning/behavior patterns.
The concepts of learning objects and reusability were invented and have been advocated for a long time. Obviously, instructionally grounded sequencing decisions are at the heart of the instructionally successful use of learning objects. Some of the most crucial information in xAPI records is the context and metrics educators need to make judgements. xAPI can help us answer many complex questions that couldn’t be answered before.
Considerations when designing xAPI statements and a profile include:

- Context => identify patterns (the essential elements in the learning design)
  - Key actors
  - What they are expected to do
  - What resources are used
  - Sequence of activities
- Metrics => identify the performance desired (related to learning objectives, evidence for assessment, rubrics)
- Questions to answer
  - Learner modeling
  - Intervention actions
  - Refinements and iterations to be planned
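As a sketch of how these considerations land in an actual record (all names, IRIs and values below are hypothetical examples, except the ADL `completed` verb), an xAPI statement can carry both the context needed for pattern-finding and the result metrics tied to learning objectives:

```python
# A minimal sketch of an xAPI statement that records both context
# (who, which activity, within which learning design) and metrics
# (score, duration). All IRIs and names are hypothetical examples.

def build_statement(actor_name, mbox, verb_id, verb_display,
                    activity_id, activity_name, score_scaled, duration):
    """Assemble an xAPI statement dict with context and result."""
    return {
        "actor": {"name": actor_name, "mbox": mbox},
        "verb": {"id": verb_id, "display": {"en-US": verb_display}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
        # Context links the statement to the surrounding learning design,
        # so reports can group records by lesson or activity sequence.
        "context": {
            "contextActivities": {
                "parent": [{"id": "https://example.org/lessons/blended-unit-1"}]
            }
        },
        # Result carries the metrics related to the learning objectives.
        "result": {
            "score": {"scaled": score_scaled},
            "duration": duration,  # ISO 8601 duration string
            "completion": True,
        },
    }

stmt = build_statement(
    "Ada", "mailto:ada@example.org",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.org/activities/video-quiz-3", "Video Quiz 3",
    0.85, "PT4M30S",
)
```

Designing the `context` and `result` fields up front, rather than after the fact, is what lets the reporting side reconstruct the learning-design patterns later.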
## Better Learning Interface Leads to Better Learning Analytics
Contexts and metrics exist at each of these three conceptual levels (not necessarily three; there may be more or fewer) — the point is the hierarchy of analysis layers. If we design with analytics purposes in mind, we can hope to correlate what works with the performances or outcomes desired. xAPI is built to give innovative learning designers total freedom in evidence-based design, not to make them comply with legacy practices.
Even at the bottom level, how a video is designed makes a difference in what we can extract from the learning records. That is what we call the “Learning Interface”, a term adapted from User Interface (UI): it means all the design factors that influence how we learn, just as the user interface decides how users interact. A well-designed learning interface gives learners better learning flow, meaningful interactions and study aids, and deeper insights can then be obtained by analyzing the learning experiences. Keep in mind that all these efforts are to help learners, so improving learning interface design should lead all development efforts.
For example, in a 1999 talk, John Seely Brown described how his colleagues experimented with video:
> We now have a prototype system for doing this designed by Dan Russell’s group at PARC. First, we capture and store the digital video on a media server, which also marks and timestamps any time there is any uniquely identifiable event such as clapping, laughing, a slide change (the latter being identifiable by shifts in the color space). The audience can also use their laptops or Palm Pilots to take notes, notes that can be time-stamped and thus cross-indexed into the video stream. We also transcribe the audio stream. That’s actually done by a transcription service that costs surprisingly little. All these “signals” are combined to make a soup of streams all cross-indexed with each other. The resulting structure becomes a very rich medium in which you can skim and pick out exciting moments where, for example, there was wild applause, laughter, heated argumentation, etc. From this structure you can tell when the energy in the room went up or when you or a colleague you know made an annotation and what kind of annotation it was.
>
> This is a first stab at trying to find ways to capture and represent additional signals, signals that are created almost as a by-product of an audience listening to a presentation and then using these signals as structural indices to the video stream. The goal is to make this a richer knowledge asset than just the video so that browsing, reflection and focused conversations are more likely to happen. Also note that if you have a diverse set of people taking notes and who are willing to identify themselves, you start to create an ecology of annotations—diverse, overlapping, richly opinionated, etc.
With these structured video-watching experiences, far more meaningful analytic results can be obtained from the data. (Related reading: Making Sense of Video-Based Learning Analytics, How to Improve Engagement and Impact of Video-Based Learning.)
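The cross-indexing idea in the quote can be sketched in a few lines: time-stamped audience events become structural indices into the video stream, and windows dense with events point to the “exciting moments” worth skimming to. The event data below is invented for illustration:

```python
# Sketch: bucket time-stamped events (seconds from the start of the
# video) into fixed windows and return the start times of windows with
# enough events to suggest a high-energy moment. Sample data is invented.

from collections import Counter

def energetic_segments(events, window=30, min_events=2):
    """Return sorted start times (seconds) of `window`-second buckets
    containing at least `min_events` events."""
    buckets = Counter((t // window) * window for t, _kind in events)
    return sorted(start for start, n in buckets.items() if n >= min_events)

events = [
    (12, "laughter"), (15, "note"), (95, "applause"),
    (100, "note"), (110, "laughter"), (250, "slide-change"),
]
print(energetic_segments(events))  # → [0, 90]
```

The same bucketing generalizes to any time-stamped xAPI stream: the segment boundaries become deep links back into the video.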
So the conclusion is that xAPI implementation design and reporting design should involve educators, co-designing solutions that answer educators’ questions. Since the learning design or lesson plan is the key, but learning experiences may be distributed, what educators need most is a learning design tool to orchestrate activities across systems and tools. The same tool can coordinate the xAPI records into patterns that educators can easily read and act on. Revealing and visualizing the whole patterns from distributed learning experiences is the true value of xAPI.
Small, actionable data is better than big data. We are co-designing this kind of tool with educators; if you are an educator or learning designer, you are welcome to share your thoughts with us.