Simon Buckingham Shum, Darrall Thompson, Maimuna Musarrat (CIC, UTS)
Zhonghua Zhang (Assessment Research Centre, University of Melbourne)
Srecko Joksimovic (Teaching Innovation Unit, University of South Australia)
In response to the changing demands on citizens and the workforce, educational institutions are starting to shift their teaching and learning towards equipping students with knowledge, skills and dispositions that prepare them for lifelong learning. These have been termed 21st Century skills/competencies, Core/Soft Skills, General Capabilities, Graduate Attributes, etc. There is now considerable activity in the school and higher education sectors tackling the challenge of tracking and assessing these competencies in practical ways. Learning Analytics should in principle have important contributions to make: providing computational support for tracking learner processes (not just products) beyond the classroom in more authentic settings, visualizing patterns, and providing rapid feedback to educators and learners (Buckingham Shum & Crick, 2016). This workshop provides the chance to learn about ongoing efforts to develop and validate “C21LA”, and the nature of the challenges these must overcome to make a systemic impact, including the pedagogical, assessment, technological and political factors that together define educational infrastructures.
21st century skills include the “Four Cs” of creativity, critical reflection, communication and collaboration (cf. Jefferson & Anderson, 2017), and Gardner’s “Five Minds”, which map to Thompson’s (2016) CAPRI model. However, many other lists include further qualities such as learning dispositions, ethics and citizenship (e.g. Care et al., 2018). While pedagogical shifts to equip students with these skills are certainly needed, they alone will not effect systemic change. A critical challenge is how these competencies can be tracked and assessed in meaningful ways, because assessment regimes drive educator and student behaviour. But since these skills are not easily quantifiable, need to be assessed over a period of time, and must be demonstrated in interpersonally, societally and culturally valid contexts, traditional methods such as observational or interview techniques are hard to apply at scale. Student self-report has an important place, but comes with obvious limitations. This has triggered significant educational research in the school and higher education sectors, but the potential of Learning Analytics is often not harnessed.
Learning Analytics should in principle have important contributions to make (cf. Buckingham Shum & Crick, 2016), for instance: providing computational support for tracking learner processes (not just products); tracking activity not only inside the classroom but also outside, in more authentic settings; tracking activity not only online but also face-to-face (via multimodal sensors/analytics); and providing rapid feedback to educators and learners to build metacognitive capabilities.
In recent years, learning analytics has been applied to develop more objective assessments of some essential 21st century skills that could not be assessed objectively, reliably and validly with traditional approaches (e.g., ICT literacy and learning in digital networks, Wilson, Scalise, & Gochyyev, 2016; collaborative problem solving, Griffin & Care, 2015; learning in online environments, Milligan & Griffin, 2016). Researchers advocate that learning analytics and measurement science should be synthesized to meet the challenges of assessing these hard-to-measure 21st century skills (Wilson & Scalise, 2016).
This workshop provides the chance for participants to share, and learn about, ongoing efforts to develop “C21LA” tools, and critically, how we validate them (e.g. Milligan, 2018; Milligan & Griffin, 2016). The workshop will include some ‘show and tell’, but speakers will be asked to reflect critically on the challenges that remain for these to make a systemic impact, including the pedagogical, assessment, technological and political factors that together define educational infrastructures.
Proposed workshop structure
The workshop will run for three hours in 30-minute segments, each segment focusing on one tool. Each segment will have a presentation (15-20 minutes), followed by discussion (10-15 minutes). There will be a plenary discussion at the end.
Workshop Presenter Credentials
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he is inaugural Director of the Connected Intelligence Centre. He has been active in shaping the field of Learning Analytics, and co-founded the Society for Learning Analytics Research.
Darrall Thompson is a Senior Lecturer and Learning Futures Fellow in the UTS Faculty of Design, Architecture and Building. His award-winning research and design thinking are embodied in the REVIEW platform, a criteria-based system used for enhancing assessment and evaluation capabilities among staff and students in universities and schools.
Zhonghua Zhang is a Research Fellow at the Melbourne Graduate School of Education, The University of Melbourne. His research interests include assessment, educational measurement, and psychometrics. He leads a project developing behavioural indicators from log stream data to measure students’ collaborative problem solving skill, which has been identified as one of the essential skills for the 21st century workplace.
Srecko Joksimovic is a Research Fellow at the School of Education and a Data Scientist in the Teaching Innovation Unit, University of South Australia. His research interests focus on exploring the symbiosis of human and artificial cognition to understand knowledge processes and their impact on society.
Maimuna (Muna) Musarrat is a Postdoctoral Research Associate at the UTS Connected Intelligence Centre, where she is working closely with the U@Uni Academy, researching the assessment of transferable skills in high school students using different tools.
References
Buckingham Shum, S. & Crick, R. D. (2016). Learning analytics for 21st century competencies. Journal of Learning Analytics, 3(2), 6–21.
Care, E., Griffin, P. & Wilson, M. (Eds.). (2018). Assessment and teaching of 21st century skills: Research and applications. Springer.
Griffin, P., & Care, E. (Eds.). (2015). Assessment and teaching of 21st century skills: Methods and approach. Dordrecht: Springer.
Jefferson, M. & Anderson, M. (2017). Transforming schools: Creativity, critical reflection, communication, collaboration. London: Bloomsbury.
Milligan, S. (2018). Methodological foundations for the measurement of learning in learning analytics. Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK ’18). ACM, New York, NY, USA, pp. 466-470.
Milligan, S. & Griffin, P. (2016). Understanding learning and learning design in MOOCs: A measurement-based interpretation. Journal of Learning Analytics, 3(2), 88–115.
Thompson, D. (2016). Marks should not be the focus of assessment — but how can change be achieved? Journal of Learning Analytics, 3(2), 193–212.
Wilson, M. & Scalise, K. (2016). Learning analytics: Negotiating the intersection of measurement technology and information technology. In J. M. Spector, B. B. Lockee, & M. D. Childress (Eds.), Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy (Published in cooperation with AECT). New York: Springer.
Wilson, M., Scalise, K., & Gochyyev, P. (2016). Assessment of learning in digital interactive social networks: A learning analytics approach. Online Learning, 20(2), 97–119.