A Learning Engineer’s Angle on Learning Analytics
October 29, 2020
Target Readers: Software Designer; Educator; Learning Technologist; Instructional Designer; Researcher; Leader; Learning Scientist
Author: Melanie Peffer (co-editor, Nexus)
Position: Research Faculty/Instructor, University of Colorado Boulder
A Learning Engineer’s Angle on Learning Analytics: Interview with Bror Saxberg
“Learning Engineering” has been proposed as a prospective field that bridges various disciplines to create learning experiences grounded in what we know about human learning, to increase our capacity to measure learning processes and outcomes, and to examine learner needs at scale.
Bror Saxberg is the VP of Learning Science in the Education Program at the Chan Zuckerberg Initiative, where he works on bringing more evidence-based practices into the learning ecosystem as a whole.
He’s no stranger to incorporating insights from across disciplines — he holds a B.A. in Mathematics and a B.S. in Electrical Engineering from the University of Washington, an M.A. in Mathematics from Oxford University, a Ph.D. in Electrical Engineering and Computer Science from MIT, and an M.D. from Harvard Medical School.
Bror is well known for his work on learning engineering, particularly from his time as Chief Learning Officer at Kaplan, Inc. Recently, Bror answered a few questions about learning engineering. The questions and his responses are below:
Question: Learning engineering is a buzzword right now - one that has resulted in (occasionally) heated discussion. Without getting into semantics, what do you feel is the heart of learning engineering?
Answer: Words and phrases do become complicated! To me, "learning engineering" is a big tent: basically, the application of evidence-grounded insights from many different disciplines to create real-world learning solutions (for academic, non-academic, and workplace outcomes), especially solutions that work at scale.
So this includes results from a wide array of research domains - learning science, developmental psychology, behavioral economics, motivational psychology, social psychology, and more - as well as a wide array of methodologies - randomized controlled trials, quasi-experimental work, correlational studies, case studies, anthropological studies, and more - all combined to help design and iterate in real-world circumstances.
Learning engineering also means paying close attention to how closely the subjects of the research match the context we're working in, and assessing the strength of the evidence foundation.
The key is triangulation - a case study on exactly the population we're interested in may not have enough statistical power for some publication purposes, but combined with other studies applying the same principles to other populations, it can build confidence that this is good input to a design.
The other thing that comes to mind when I consider "learning engineering" (as with any engineering) is the requirement to iterate: humility about your first effort, even one grounded in good evidence, should lead you to instrument carefully, look for failure modes as you try it out, and be prepared to keep iterating for different subgroups and contexts as needed.
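To make the triangulation point concrete, here is a minimal editorial sketch (not from the interview) of one standard way to combine several individually underpowered studies: inverse-variance pooling, as in a fixed-effect meta-analysis. All study numbers below are hypothetical, purely for illustration.

```python
# A minimal sketch of "triangulation": pooling effect estimates from
# several small studies via inverse-variance weighting (a standard
# fixed-effect meta-analysis). All numbers are hypothetical.
import math

# (effect size, standard error) for three hypothetical studies,
# each on a different population and underpowered on its own
studies = [(0.30, 0.20), (0.25, 0.15), (0.40, 0.25)]

for i, (effect, se) in enumerate(studies, start=1):
    # each individual 95% CI crosses zero, so no single study "counts"
    print(f"study {i}: {effect:.2f} +/- {1.96 * se:.2f} (95% CI)")

weights = [1 / se**2 for _, se in studies]           # inverse-variance weights
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))              # SE of the pooled estimate

print(f"pooled: {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
```

In this made-up example, each study's 95% interval crosses zero on its own, but the pooled estimate's interval does not - the kind of accumulated confidence across populations that triangulation is meant to build.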
Question: How do you view the intersection of learning analytics and learning engineering? Put another way, why would a learning analytics researcher be interested in learning engineering?
Answer: The two are closely interconnected, at least as I use the phrases. "Learning analytics" is about how to use data (possibly from multiple sources) to drive hypotheses and conclusions about what is going on, and for which subgroups. "Learning engineering" is more generally about the design and iteration of evidence-grounded learning experiences as a whole, and so benefits greatly from learning analytics expertise to see how things are going, and for whom.
Someone steeped in learning analytics could easily begin thinking about how to use those techniques, together with additional information, to design/redesign learning environments for new contexts. An analogy (not a perfect one): why would a statistician become an economist? Because they want to apply a set of methodologies they have mastered within an increasingly deep context of results and additional ideas - and they need additional expertise to do that well.
Question: Do you think learning engineers think about learning analytics differently than other communities do?
Answer: That's a tough question. I would say a learning engineer would be very engaged by how learning analytics can help them understand what is working for whom (near- and long-term) at scale in a practical learning environment, and would want to keep iterating to look for improvements for various subgroups over time.
It's possible that other researchers might engage with learning analytics for other reasons - e.g., to analyze a dataset to see if an effect is present - but for purposes of publication and further research, not necessarily at scale, longitudinally, or for practical impact.
Question: We know that simply showing data to educators or students doesn’t lead to actionable insights. Can you give us any examples of effective analytics and what it was about the design process that made them work?
Answer: There's been quite a bit of research on this, and I think part of the problem is cognitive load: a busy teacher (and many other potential users of data) doesn't have time to build the expertise needed to sort out the best actions to take from raw data they are handed. What seems likely to work better is processing the evidence (while still keeping it available) into simpler blocks and suggestions that practitioners can more easily act on - simpler to tackle, and drawing on their own practice intuitions to make the final calls.
At the same time, iteration again is key: at one virtual university I was involved with, for example, it turned out that of three different types of data about whether students were engaging with a course, only one kind helped teachers actually improve both learning outcomes and continued engagement. That could not be predicted in advance; it had to be tried out with good evidence-gathering.
Question: What do you wish researchers, teachers, and designers knew about learning engineering?
Answer: I wish folks beginning their efforts to design or improve a learning environment would realize there is a rich array of approaches and principles, derived from decades of research, that can help them get a good first draft: principles of cognitive load, multimedia theory, motivational underpinnings, the importance of identity and toxic stress, and more.
It's also important to recognize that experts no longer necessarily have verbal access to all of their expertise. Estimates from cognitive science research suggest that more than 70% of what experts decide and do is no longer conscious - it is tacit, whether as pattern recognition or even as extended processes through time. And these days, with technology's influence, what a top expert decides and does is changing faster and faster!
That means truly designing learning to help people match what the best practitioners decide and do requires real work up-front - and recognition that what top performers decide and do is not merely technical, but includes communication, organization, social relationships, and more.
It's also very important to apply learning engineering to the development of teacher/faculty/trainer expertise, not just student expertise: changing what teachers decide and do to match what we want students to experience is its own learning challenge, and teachers, too, need support and empathy for where they start and where they need to get to. This, too, requires iteration.
Finally, I wish more people saw how the research on expertise development and learning supports real optimism: unlike many of our formal learning environments, which look like a race against time, real expertise development is not a race against school or training calendars, but rather the accumulation of sufficient well-designed practice and feedback against well-defined outcomes over extended time. Some people can do this within the race-against-time conditions we mostly use, but many more could achieve their dreams through more extended practice and feedback, more closely matched to what they bring and to their own motivations and passions.
Question: The COVID-19 pandemic has resulted in a massive, rushed shift to educational technologies. What do learning engineers and learning analytics researchers bring to this shift?
Answer: In the very short term, learning engineers collaborating closely with educators and others can make rapid, evidence-grounded judgments to do the best possible work within the tight constraints of a sudden shift in conditions.
It’s a little like dropping a well-trained ER physician into a setting without all the usual technology - I'd still rather have them working on my health issues than someone with no grounding in human health, although there may well be local experts to collaborate with for an even better combined result.
Similarly, learning analytics professionals can try to quickly use current evidence sources to see how things are going, and potentially prioritize where to make changes first.
Over time, both sets of professionals can continue to help evolve practices, information flows, and the selection of new approaches to match the contexts of learners and educators, perhaps doing a better and better job of disentangling what works for whom.
The very disruption we are experiencing can become a source of energy for innovation - everyone is pushed to work virtually, through technology, so applications of learning engineering to both professional development and what students experience are pushed to the fore.
Collaboration ends up being key - among learning engineers, learning analytics professionals, teachers, students, administrators, and families - to keep looking for what works best for whom, and how to keep iterating and improving.