Learning analytics is the "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs," according to the 1st International Conference on Learning Analytics and Knowledge.1 The NMC Horizon Report: 2012 Higher Education Edition notes that this promising set of practices and tools aims to "harness the power of advances in data mining, interpretation, and modeling to improve understandings of teaching and learning, and to tailor education to individual students more effectively."2 Finally, George Siemens and Phil Long have even proposed that learning analytics should ultimately be focused on disruption and transformation in education, changing the very nature of teaching, learning, and assessment as we know it.3
Indeed, learning analytics may hold great promise as a way to support learning assessment and as a higher education "movement." The potential of learning analytics to combine information from multiple and disparate sources, to foster more-effective learning conditions in real time, and to enable multiple focal points for analysis and improvement is enticing. However, even though learning analytics offers powerful tools and practices to improve the work of learning and assessment, well-considered principles and propositions for learning assessment should inform its careful adoption and use. Otherwise, learning analytics risks becoming a reductionist approach for measuring a bunch of "stuff" that ultimately doesn't matter. In my world, learning matters.
Where We've Been
In 1992, the American Association for Higher Education's Assessment Forum published 9 Principles of Good Practice for Assessing Student Learning.4 This seminal document has both described and promoted meaningful attributes of student learning assessment in higher education, and it has helped facilitate the "learning outcomes movement" in higher education, fueling curricular, instructional, and institutional reform in myriad ways. If learning analytics is ultimately to be a transformative set of practices and tools for improving student learning, then these assessment principles should inform the adoption of analytics. Although all nine principles deserve a deep reading, I will address here the three that are most significant for learning analytics.
Assessment Principle 1: The assessment of student learning begins with educational values. Just as learning assessment "begins with and enacts a vision of the kinds of learning we most value for students," so too should learning analytics. As this principle predicts, when educational mission and values are not considered carefully, assessment (similarly, analytics) "threatens to be an exercise in measuring what's easy, rather than a process of improving what we really care about." Without careful consideration of what we really care about, of the kinds of learning that are most significant for students, and of how we know when they have learned, learning analytics may measure a lot but will fail to measure what is important.
Assessment Principle 2: Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. The adoption of learning analytics, too, must be informed not only by what can be measured but also by what cannot. There will be limits to what learning analytics can do. In this vein, Siemens and Long have appropriately acknowledged that learning "is messy" and have warned that with learning analytics, "we must guard against drawing conclusions about learning processes based on questionable assumptions that misapply simple models to a complex challenge."5 The message here is important: not every aspect of learning can be captured by the powerful tool that analytics promises to be. Sometimes learning is ineffable! Therefore, multiple methods for assessing learning should be employed, including assessments that function as learning opportunities to support students' deep integration of knowledge, their personal development, and (hopefully!) their transformation over time.
Assessment Principle 8: Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. As this principle states, assessment alone changes very little; likewise, learning analytics cannot act alone in radically disrupting and transforming education. Assessment (when done well) is about the authentic and deep understanding and improvement of teaching and learning. Analytics is about using the power of information technology to see patterns of success (or failure) in learning. Combining the two might actually produce the seeds of transformation—a powerful inquiry into what supports authentic, deep, transformative learning for students.
Where We're Going
The seven underpinning propositions of Assessment 2020: Seven Propositions for Assessment Reform in Higher Education are especially relevant to learning analytics. They include positioning assessment as a central feature of teaching and learning; ensuring that students develop the capacity to self-assess and appropriately assess others; and focusing assessment on educational purposes (assessment for and as learning versus assessment of learning for accountability purposes).6 To highlight the important connections between assessment and learning analytics, I have embedded "learning analytics" parenthetically in the seven propositions and have added a summary of each:
Assessment [Learning Analytics] has most effect when:
- … assessment [learning analytics] is used to engage students in learning that is productive. Assessment and learning analytics focus students on the processes for and outcomes of learning and are themselves learning activities.
- … feedback is used to actively improve student learning. Informative and supportive feedback enables a positive attitude toward learning; students should receive specific information about how to improve.
- … students and teachers become responsible partners in learning and assessment [learning analytics]. Students progressively take responsibility; students develop the ability to judge their own and others' work; and there is an ongoing dialogue about assessment and learning analytics processes among faculty and students.
- … students are inducted into the assessment [learning analytics] practices and cultures of higher education. Such practices are carefully structured early in a course so that students can transition successfully, and the practices are responsive to students' diverse expectations and experiences.
- … assessment [learning analytics] for learning is placed at the centre of subject and program design. Assessment and learning analytics are included in the earliest stages of curriculum and course development and are organized holistically across subjects and programs.
- … assessment [learning analytics] for learning is a focus for staff and institutional development. Institutions and staff require learning opportunities to develop skills and structures for good assessment and good learning analytics practice.
- … assessment [learning analytics] provides inclusive and trustworthy representation of student achievement. Assessing integrated learning requires looking more broadly at a student's understanding and the disposition "that a student builds across the curriculum and co-curriculum, from making simple connections among ideas and experiences to synthesizing and transferring learning to new, complex situations within and beyond the campus."7
Help Me Help You
In a recent EDUCAUSE Review article, Randy Bass noted: "The learning we are coming to value most is not always where we are putting our greatest interest and effort in assessment, including the emerging discussions about 'learning analytics.'"8 Educational technologists and teaching, learning, and assessment professionals need to get together to ensure appropriate, principled uses of this potentially powerful technology so that it will support the kinds of learning we value. We should engage with each other to address questions about learning and assessment and about how analytics might support both: How can our practices ultimately support better learning for students? How might educational systems look different as a result of good learning assessment and analytics practices? What might be the ultimate results of educational disruption and transformation? What might we each contribute to this vision, and what might we all learn (or need to learn)? Three strategies can help us have this conversation in a sustained way.
First, we should enact and promote a collaborative scholarship of teaching, learning, assessment, and educational technologies. Teaching, learning, and assessment are all part of the same educational ecosystem, to which we should add educational technologies such as learning analytics. Our collective research and its implications should address this ecosystem. Second, we should intentionally break down the existing institutional and organizational silos that prevent us from working and learning together. Educational technologists, instructional designers, assessment specialists, faculty developers, and learning consultants must all work together, strategically, to move our work forward and develop a shared vision. Third, we all should remain highly aware of which aspects of learning cannot be measured, as well as the very real limits of technology for addressing the uniquely human and inherently social process that is learning.
Educational technologists seeking to take advantage of all that learning analytics may be able to offer should look back to the past work from the learning outcomes and assessment movement and look forward to the propositions for the future of learning assessment to ensure that analytics ultimately serves learning in a meaningful way and also serves meaningful learning. Let our understandings of learning and assessment continue to help pave the way for enabling educational disruption and transformation—and for improving what really matters.
- 1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, February 27–March 1, 2011, as cited in George Siemens and Phil Long, "Penetrating the Fog: Analytics in Learning and Education," EDUCAUSE Review, vol. 46, no. 5 (September/October 2011).
- New Media Consortium and EDUCAUSE Learning Initiative, NMC Horizon Report: 2012 Higher Education Edition, p. 22.
- Siemens and Long, "Penetrating the Fog."
- American Association for Higher Education, 9 Principles of Good Practice for Assessing Student Learning, 1992.
- Siemens and Long, "Penetrating the Fog."
- David Boud, Assessment 2020: Seven Propositions for Assessment Reform in Higher Education (2010).
- Association of American Colleges and Universities (AAC&U), "Integrative Learning VALUE Rubric."
- Randy Bass, "Disrupting Ourselves: The Problem of Learning in Higher Education," EDUCAUSE Review, vol. 47, no. 2 (March/April 2012).