PhD in Learning Analytics

Welcome to the UTS:CIC Doctoral Program

CIC’s doctoral program in Learning Analytics offers UTS Scholarships starting in Autumn and Spring sessions for both domestic students (i.e. those who do not require a visa) and international students. Applicants may be awarded a UTS Scholarship from CIC’s quota, or be supported by CIC to compete against applicants from across the university for a Scholarship.

Application Deadlines for Domestic Students

Session        Closing date        Note
Autumn 2019    CLOSED              For commencement January 2019
Spring 2019    30 April 2019       For commencement July 2019
Autumn 2020    30 September 2019   For commencement January 2020

Application Deadlines for International Students

Session        Closing date       Note
Autumn 2019    CLOSED             For commencement January 2019
Spring 2019    CLOSED             For commencement July 2019
Autumn 2020    30 June 2019       For commencement January 2020

CIC’s primary mission is to maximise the benefits of analytics for UTS teaching and learning. The Learning Analytics Doctoral Program is part of our strategy to cultivate transdisciplinary innovation to tackle challenges at UTS, through rigorous methodologies, arguments and evidence. A core focus is the personalisation of the learning experience, especially through improved feedback to learners and educators.

As you will see from our work, and the PhD topics advertised, we have a particular interest in analytics techniques to nurture in learners the creative, critical, sensemaking qualities needed for lifelong learning, employment and citizenship in a complex, data-saturated society.

We invite you to apply for a place if you are committed to working in a transdisciplinary team to invent user-centred analytics tools in close partnership with the UTS staff and students who are our ‘clients’.

Please explore this website so you understand the context in which we work, and the research topics we are supervising. We look forward to hearing why you wish to join CIC, and how your background, skills and aspirations could advance this program.

CIC Doctoral Candidates

“At UTS we are proud to be rated the top young university in Australia and within the top 200 universities globally.”

CIC reports directly to Professor Shirley Alexander, Deputy Vice-Chancellor and Vice-President, Education & Students — whose learning and teaching strategy, through a $1.2B investment in new learning spaces, is ranked as world leading. Data and analytics are a core enabler of the UTS vision for learning.futures. Personalised learning through analytics-powered feedback is a priority in the UTS 2027 Strategy that CIC leads, so your work here will be right at the forefront of this. It is rare to have a Learning Analytics research centre positioned so strategically in a university, reflecting the boldness of the UTS leadership.

Our primary audience is UTS, working closely with faculties, information technology and student support units to prototype new analytics applications. Since we are breaking new ground, developing approaches that have wide applicability, we disseminate this for research impact. As you can see from our projects, we conduct both internally and externally-funded work.

CIC works closely with key teams in UTS who support the improvement of teaching and learning, including the Institute for Interactive Media & Learning (IML), Higher Education Language & Presentation Support (HELPS), and the Information & Technology Division to ensure that our tools can operate and scale in UTS as required. The annual Learning & Teaching Awards showcase leading educator practice, while the Assessment Futures program is defining the contours of assessment regimes relevant to the real world.

While you are expected to take charge of your own tool development, CIC’s application developer may well be able to support you with some web, mobile or script development to enable your research.

While CIC is inventing new analytics tools, we are also interested in evaluating open source and commercial learner-facing applications that have interesting potential for analytics.

PhD projects often add to and learn from ongoing projects, so think about whether your work connects to mainstream tools used in UTS such as Blackboard, SPARK, ReView and A.nnotate, as well as more experimental products such as Glyma and Declara, and prototypes like AWA, CLA and Compendium. You may bring expertise in particular data analysis tools; those already in use in CIC include R, Weka, RapidMiner, ProM and Tableau.

Topic-specific technical skills and academic grounding that you will need for your PhD are specified in the PhD project descriptions, but there are some common skills and dispositions that we are seeking, given the way that we work.

  • CIC is committed to multidisciplinarity, which we hope will become transdisciplinary as we build enough common ground for the disciplines to inform or even transform perspectives. Thinking outside your ‘home turf’ is not easy or comfortable, but we are seeking people with an appetite to stretch themselves with new worldviews.
  • CIC is committed to user-centred participatory design of learning analytics tools, so you will need a passion for, and commitment to, working with non-technical users as you prototype new tools. We are seeking excellent interpersonal and communication skills in order to translate between the technical and educational worlds, and creative design thinking to help users engage with new kinds of tools. Ideally, you will already have had some design experience, but this can also be an area you want to learn.

Scholarships

Successful candidates will be eligible for a 3-year Scholarship of $35,000/pa for a full-time student (a substantial increase on the standard Australian PhD stipend of $25,849). To this, you may be able to add potential teaching income from the many opportunities to work with Master of Data Science & Innovation students. In addition, as far as possible, CIC will fund you to present peer-reviewed papers at approved, high-quality conferences.

Domestic students have their tuition fees covered by the Australian Government’s Research Training Program (RTP) Fees Offset Scholarship. Please note, all scholarships at UTS are dependent upon satisfactory progress throughout the three years.

We are also open to applications from self-funded full-time and part-time candidates, in which case you may propose other topics that fit CIC’s priorities.

Eligibility

To be eligible for a scholarship, a student must minimally:

  • have completed a Bachelor Degree with First Class Honours or Second Class Honours (Division 1), or be regarded by the University as having an equivalent level of attainment;
  • have been accepted for a higher degree by research at UTS in the year of the scholarship;
  • have completed enrolment as a full-time student.

Additional requirements are detailed under each of the topic areas.

Selection Criteria

Appointments will be made based on the quality of the candidates, their proposals, the overall coherence of the team, the potential contribution to UTS student and educator experience, and the research advances that will result.

The criteria, both generic and specific to advertised projects, are specified under each of the topic areas. Evidence will be taken from an applicant’s written application, face-to-face/video interview, multimedia research presentation at the interview, and references.

Applications

Applicants for a Studentship should submit:

  • Covering letter
  • Curriculum Vitae
  • Research Proposal, maximum 4 pages, applying for one of the advertised PhD topics

Please email your scholarship application as a PDF, with PhD Application in the subject line, to:

Gabrielle.Gardiner@uts.edu.au

Following discussion with the relevant potential supervisors, you will be required to complete the formal UTS application process.

To begin this formal application process, click here and complete the following steps:

  1. Scroll down to “Lodge your application”
  2. Click on the blue “Register and Apply” button
  3. When you reach the section asking you to select your course, enter ‘data science’ into the free text search and the CIC Doctor of Philosophy – C02062 should come up.

Deadline

The deadlines for applications are noted in the tables above. However, there is an advantage in opening discussions earlier: you are strongly encouraged to contact project leads informally well in advance of the deadline, because if we like your proposal we will offer you a place as soon as we can, and you need to know where you stand.

So please get in touch with the Director if you have queries about CIC in general, and with the relevant supervisors about the topic of interest to you.

The UTS application form and further guidance on preparation and submission of your research proposal are on the UTS Research Degrees website.

PhD Topics

We invite scholarship applications to investigate the following topics, which are broad enough for you to bring your own perspective. If you have your own funding, then you may propose another topic which fits with CIC’s priorities.

1. Classroom Analytics
2. Data interoperability and analytics for lifelong personalised learning
3. Analytics for Collaborative Evidence-Based Reasoning
4. Learning Analytics & Learning Design
5. Writing Analytics for Deep Reflection

Multimodal Learning Analytics in the Classroom

Supervisors

Roberto Martinez-Maldonado and Simon Buckingham Shum

The Challenge

The learning analytics challenge for this PhD is to research, prototype and evaluate approaches to automatically capture traces of students’ activity, using multimodal analytics techniques to make sense of data from heterogeneous contexts. Depending on the trajectory that you take, examples of the questions that such a project could investigate include:

  • How can multimodal analytics approaches be applied to gain a holistic understanding of students’ activity in authentic learning spaces?
  • How can insights into students’ activity in physical spaces be connected with higher-level pedagogies?
  • How can these insights promote productive behavioural change?
  • How can the teacher be supported with this information to provide informed feedback?
  • How can learners and teachers be supported with data in the classroom?
  • What are the ethical implications of rolling out analytics in the classroom?
  • How can this information support more authentic and holistic assessment?
  • What are the technical challenges that need to be overcome?
  • How do learning theories and learning design patterns map to the orchestration of such analytics tools?

Analytic Approaches

We are seeking a PhD candidate interested in designing and connecting Multimodal Learning Analytics solutions according to the pedagogical needs and contextual constraints of learning occurring across physical and digital spaces. Providing continued support in the classroom, in mobile experiences and in web-based systems has been explored to different extents, and each setting poses its own challenges. An overarching concern is how to integrate and coordinate learning analytics in a coherent way. Synergies with educational frameworks and theories already drawn on by the CIC team will be encouraged, such as Learning Power (Deakin Crick et al, 2015; CLARA tool), Epistemic Cognition, and science and technology studies of information infrastructure. The Connected Learning Analytics toolkit is another candidate infrastructure.

Addressing these questions should lead to educationally grounded machine learning techniques that give insight into heterogeneous activity traces (e.g. Martinez-Maldonado et al, 2018), and the design and evaluation of teacher and/or student-facing dashboards that provoke productive sensemaking, and inform action (e.g. Martinez-Maldonado et al, 2012). We invite your proposals as to which techniques might be best suited to this challenge.
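
To give a flavour of what working with heterogeneous activity traces involves, here is an illustrative sketch only (the column names and data are invented, and this is not CIC code): a common first step is aligning streams captured at different sampling rates onto a shared timeline before any modelling.

```python
import pandas as pd

# Hypothetical multimodal streams with different sampling rates:
# indoor-positioning pings and heart-rate readings for two students.
positions = pd.DataFrame({
    "ts": pd.to_datetime(["09:00:00", "09:00:01", "09:00:02", "09:00:01"]),
    "student": ["s1", "s1", "s1", "s2"],
    "x": [0.5, 1.2, 3.0, 1.0],
    "y": [1.0, 1.5, 3.5, 0.8],
})
heart_rate = pd.DataFrame({
    "ts": pd.to_datetime(["09:00:00", "09:00:02", "09:00:00"]),
    "student": ["s1", "s1", "s2"],
    "bpm": [72, 95, 80],
})

# Align each position ping with the most recent heart-rate reading
# (within 2 seconds) for the same student, giving one fused trace.
fused = pd.merge_asof(
    positions.sort_values("ts"),
    heart_rate.sort_values("ts"),
    on="ts",
    by="student",
    tolerance=pd.Timedelta("2s"),
)

# A toy derived indicator: time points spent inside a zone of interest
# (e.g. near a patient manikin), a crude proxy for task engagement.
in_zone = fused[fused["x"].between(0, 2) & fused["y"].between(0, 2)]
print(in_zone.groupby("student").size())
```

Fused traces of this kind are the raw material for the machine learning and dashboard work described above.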

You will work in close collaboration with ‘clients’ from other faculties/units in UTS, and potentially industry partners, with opportunities for synergy with existing projects and tools as described on the CIC website. For more information about ongoing research in this area, please visit the CrossLAK website.

Examples of the research currently associated with this PhD topic include the following:

HealthSimLAK: Multimodal Learning Analytics meet Patient Manikins

High Performance Teamwork Analytics in Physical Spaces

Candidates

In addition to the skills and dispositions that we are seeking in all candidates, you should have:

  • A Masters degree, Honours distinction or equivalent with at least above-average grades in computer science, mathematics, statistics, or equivalent
  • Analytical, creative and innovative approach to solving problems
  • Strong interest in designing and conducting quantitative, qualitative or mixed-method studies
  • Strong programming skills in at least one relevant language (e.g. C/C++, .NET, Java, Python, R, etc.)
  • Experience with data mining, data analytics or business intelligence tools (e.g. Weka, ProM, RapidMiner). Visualisation tools are a bonus.

It is advantageous if you can evidence:

  • Experience in designing and conducting quantitative, qualitative or mixed-method studies
  • Familiarity with educational theory, instructional design, learning sciences
  • Peer-reviewed publications
  • A digital scholarship profile
  • Design of user-centred software

Interested candidates should contact Roberto.Martinez-Maldonado@uts.edu.au and Simon.BuckinghamShum@uts.edu.au with informal queries. Please follow the application procedure for the submission of your proposal.

References

Aljohani, Naif R. and Davis, Hugh C. (2012) Learning analytics in mobile and ubiquitous learning environments. In Proceedings of the 11th World Conference on Mobile and Contextual Learning: mLearn 2012, Helsinki, Finland.

Deakin Crick, R., S. Huang, A. Ahmed-Shafi and C. Goldspink (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies 63(2): 121- 160.

Kitto, Kirsty, Sebastian Cross, Zak Waters, and Mandy Lupton. (2015). Learning analytics beyond the LMS: the connected learning analytics toolkit. In Proceedings of the 5th International Conference on Learning Analytics And Knowledge, Poughkeepsie, New York: ACM, pp. 11-15

Martinez-Maldonado, R., Clayphan, A., Yacef, K. and Kay, J. (2015) MTFeedback: providing notifications to enhance teacher awareness of small group work in the classroom. IEEE Transactions on Learning Technologies, TLT, 8(2): 187-200

Martinez-Maldonado, R., Kay, J., Buckingham Shum, S., and Yacef, K. (2017). Collocated Collaboration Analytics: Principles and Dilemmas for Mining Multimodal Interaction Data. Human-Computer Interaction, HCI, In Press.

Martinez-Maldonado, R., Yacef, K., Kay, J., and Schwendimann, B. (2012) An interactive teacher’s dashboard for monitoring multiple groups in a multi-tabletop learning environment.  International Conference on Intelligent Tutoring Systems, pages 482-492.

Data interoperability and analytics for lifelong personalised learning

Supervisors

Kirsty Kitto, Roberto Martinez-Maldonado and Simon Buckingham Shum

The Challenge

It is likely that people entering the workforce today will need to change jobs multiple times throughout their lifetime (CEDA, 2015). Many existing job roles are likely to be automated, but new roles in the workforce of the future are emerging all the time. Higher education is likely to be just the start of a person’s learning journey; many people will need to up-skill, re-skill and retrain throughout their careers. This means that already thorny problems, such as the recognition of prior learning, are going to become key: how can we recognise existing skills, knowledge and competencies when they come from a wide array of domains and environments?

Increasingly we see claims emerging that technology will help to personalise learning, building upon the existing strengths of a learner and helping them to bolster their weaknesses. Many Adaptive Learning systems are already in existence, building upon a long line of work in Intelligent Tutoring (Nye, Graesser, & Hu, 2014; Ma, Adesope, et al., 2014) and recommendation systems built for EdTech (Manouselis, Drachsler, et al., 2011). These systems claim to identify existing weaknesses in learners and then personalise the learning experience, providing an individual journey that is adapted specifically for them. But as Caulfield (2016) correctly claims, “we have personalisation backwards” if we are attempting to provide the same remedy for students who come from very different backgrounds. Many others have called attention to the long history of attempts to “optimize” learning (e.g. Watters, 2015; Kohn, 2016), pointing out that it does nothing to innovate on an “old-school model that consists of pouring a bunch o’ facts into empty receptacles” (Kohn, 2016). Also worrying, the loss of autonomy associated with a tool that tells students precisely what to do next leads to a loss of serendipity, and will discourage the development of growth mindsets and an ability to thrive in situations of ambiguity and uncertainty (Deakin Crick et al, 2015).

This PhD project will seek to tackle these problems head-on, by investigating ways in which technology solutions can help in the construction of personal learning journeys: journeys that help learners build upon their existing knowledge, skills and backgrounds, and then demonstrate the achievement of capabilities and competencies that map into both formal educational systems and work-based selection criteria.

Underlying such a project, we require a way of providing the learner of the future with a Personal Learning Record Store (PLRS) that they can access and make use of for life. This project will seek to develop early prototypes of a PLRS that satisfies core use cases identified by you. It will be important to keep in mind the long-term legal, ethical, and social implications of a technology of this form, so your project will be about more than just developing tech: you will need to keep the learner firmly in mind while solving core technical problems concerning interoperability and learner-facing learning analytics. Depending on the trajectory that you take, examples of the questions that such a project could investigate include:

  • What data needs to be stored in a PLRS in order to facilitate lifelong personalised learning pathways?
  • What form should a PLRS take to facilitate lifelong learning?
  • How could high-level educationally relevant constructs be discovered from low-level clickstream data and then mapped to the attainment of key skills and competencies?
  • What new learning designs and patterns can be created to take advantage of the large amount of learning data stored in a PLRS?
  • How can xAPI profiles and recipes be used to ensure that learning data collected from multiple educational systems and workplaces are both syntactically and semantically interoperable in a PLRS?
  • What analytics would enable a learner to understand key weaknesses (and strengths) that are evidenced by the low-level data contained in their PLRS?
  • How can we map between identified curriculum documentation and the data stored in a PLRS?
  • What analytics will help lifelong learners to understand the data in their PLRS, and to order it appropriately?
  • How can selected data from a PLRS be pulled into e-Portfolios and curriculum vitae?

Analytic Approaches

The challenge of developing learning analytics for lifelong learning competencies is at a relatively early stage of development (Buckingham Shum & Deakin Crick, 2016). Early work with the Connected Learning Analytics (CLA) toolkit (Kitto, Cross et al., 2015; Bakharia, Kitto et al., 2016) has demonstrated that it is possible, with careful consideration, to create interoperable data structures from apparently disparate data sources. This project will seek to extend those results by developing frameworks and use cases for personalised lifelong learning that take full advantage of the fact that learning can happen anywhere, at any time, and in many different places.

Depending upon the emphasis that your research project develops, you will need to make use of emerging educational data standards such as xAPI (ADL, 2013) and IMS Caliper (IMS, 2015), and couple them with existing frameworks to ensure that PLRS data is interoperable despite being collected across many different places and contexts. A good place to start might involve investigating the way in which the xAPI concept of a Learning Record Store can be extended to enable individual learners to link it with existing organisational Information Systems (e.g. Student Information Systems and Learning Management Systems). The World Wide Web Consortium’s (W3C) Resource Description Framework (RDF) Linked Data (LD) technology stack could also be used to ensure that concepts such as “course”, “award”, and “badge” map between educational domains (e.g. Europe and Australia), enabling data to travel with a learner as they move between institutions (e.g. from UTS to Oxford) and then into an increasingly globalised workforce.
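
As a minimal illustration of what sending learning data to a Learning Record Store looks like in practice (a sketch only: the LRS endpoint, credentials, learner and activity IRIs below are invented placeholders, but the statement structure follows the xAPI 1.0.3 specification):

```python
import requests

# A minimal xAPI statement: actor-verb-object, per the xAPI 1.0.3 spec.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",            # placeholder learner
        "mbox": "mailto:learner@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.edu/activities/data-science-101",  # invented IRI
        "definition": {
            "name": {"en-US": "Data Science 101"},
            "type": "http://adlnet.gov/expapi/activities/course",
        },
    },
}

# POST to a hypothetical Personal Learning Record Store endpoint.
resp = requests.post(
    "https://plrs.example.edu/xAPI/statements",   # placeholder LRS
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("client_id", "client_secret"),          # placeholder credentials
)
resp.raise_for_status()
print(resp.json())  # a conformant LRS returns the stored statement id(s)
```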

This project will help to progress our understanding of how we might be able to create an open source Learning Analytics API for any data stored in a PLRS that meets the requirements of specific identified xAPI recipes and profiles (ADL, 2016). This will help us to understand how learners might provide evidence of competency in 21st-century skills like “creativity” and “communication”, and other core graduate employability skills (Bridgstock, 2017), by pulling data from their PLRS. This project also offers an opportunity to work towards rethinking the way in which people might use extracurricular activities to add further weight to their claims of competency and achievement.
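
Relatedly, the Linked Data mapping sketched above could be prototyped along the following lines with the rdflib library, using SKOS to assert that concepts from two jurisdictional vocabularies closely match. The namespaces here are invented placeholders; real work would build on established published vocabularies.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, SKOS

# Invented namespaces standing in for two jurisdictions' vocabularies.
AU = Namespace("https://example.edu.au/terms/")
EU = Namespace("https://example.eu/terms/")

g = Graph()
g.bind("skos", SKOS)
g.bind("au", AU)
g.bind("eu", EU)

# Declare the two concepts, and assert that the Australian "unit"
# closely matches the European "module", so that records phrased in
# one vocabulary can be interpreted in the other.
g.add((AU.unit, RDF.type, SKOS.Concept))
g.add((EU.module, RDF.type, SKOS.Concept))
g.add((AU.unit, SKOS.closeMatch, EU.module))

print(g.serialize(format="turtle"))
```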

Candidates

In addition to the skills and dispositions that we are seeking in all candidates, you should have:

  • A Masters degree, Honours distinction or equivalent with at least above-average grades in computer science or equivalent
  • An analytical, creative and innovative approach to solving problems
  • A strong interest in data interoperability, linked data, SKOS, RDF, etc.
  • Strong programming skills in at least one relevant language (e.g. Python, C/C++, Java) and associated programming frameworks

It is advantageous if you can evidence:

  • Experience in designing APIs
  • Familiarity with Experience API (xAPI) and/or IMS Caliper
  • Experience with systems architecture and design
  • Peer-reviewed publications
  • A digital scholarship profile
  • Design of user-centred software

Interested candidates should contact Kirsty.Kitto@uts.edu.au and Roberto.Martinez-Maldonado@uts.edu.au with informal queries. Please follow the application procedure for the submission of your proposal.

References

ADL. (2013). xAPI-Spec. https://github.com/adlnet/xAPI-Spec, version 1.0.3 accessed 11/4/2017

ADL. (2016). Companion Specification for xAPI Vocabularies, https://adl.gitbooks.io/companion-specification-for-xapi-vocabularies/content/ , version 1.0 accessed 11/4/2017

Bakharia, A., Kitto, K., Pardo, A., Gašević, D., & Dawson, S. (2016, April). Recipe for success: lessons learnt from using xAPI within the connected learning analytics toolkit. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 378-382). ACM.

Bridgstock, Ruth (2017). Graduate employability 2.0: Social Networks for learning, career development and innovation in the digital age. Available at:  http://www.graduateemployability2-0.com/

Buckingham Shum, S., & Deakin Crick, R. (2016). Learning analytics for 21st century competencies. Journal of Learning Analytics, 3(2), 6–21.  http://dx.doi.org/10.18608/jla.2016.32.2

Caulfield, Mike (2016) We have personalization backwards,  http://mfeldstein.com/we-have-personalization-backwards/

CEDA. (2015). Australia’s future workforce? Technical report, Committee for Economic Development of Australia (CEDA). http://www.ceda.com.au/research-and-policy/policy-priorities/workforce

IMS. (2015). Caliper Analytics, http://www.imsglobal.org/activity/caliperram

Kitto, Kirsty, Sebastian Cross, Zak Waters, and Mandy Lupton. (2015). Learning analytics beyond the LMS: the connected learning analytics toolkit. In Proceedings of the 5th International Conference on Learning Analytics And Knowledge, Poughkeepsie, New York: ACM, pp. 11-15

Kitto, K., Lupton, M., Davis, K & Waters, Z.(2016). Incorporating student-facing learning analytics into pedagogical practice. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show Me The Learning. Proceedings ASCILITE 2016 Adelaide (pp. 338-347)

Kohn, Alfie (2016). The overselling of Education Technology, Edsurge: https://www.edsurge.com/news/2016-03-16-the-overselling-of-education-technology

Ma, W., Adesope, O., Nesbit, J., & Liu, Q. (2014). Intelligent tutoring systems and learning outcomes: A meta-analysis. Journal of Educational Psychology, 106(4), 901-918.

Manouselis, N., Drachsler, H., Vuorikari, R., Hummel, H., & Koper, R. (2011). Recommender systems in technology enhanced learning. In Recommender systems handbook (pp. 387-415). Springer US.

Nye, B. D., Graesser, A. C., & Hu, X. (2014). AutoTutor and family: A review of 17 years of natural language tutoring. International Journal of Artificial Intelligence in Education, 24(4), 427-469.

Sharples, Mike, and Jeremy Roschelle. (2010). Guest editorial: Special section on mobile and ubiquitous technologies for learning. IEEE Transactions on Learning Technologies, (1), pp. 4-6.

Watters, Audrey (2015). The algorithmic future of education. http://hackeducation.com/2015/10/22/robot-tutors

Analytics for Collaborative Evidence-Based Reasoning

Supervision Team

Simon Buckingham Shum (UTS:CIC) and Tim van Gelder (U. Melbourne)

This project is a collaboration between UTS:CIC and the University of Melbourne SWARM Project. The successful candidate will be based in Sydney, also spending time in Melbourne.

The Challenge

The real world challenge: improving collaborative EBR

The challenges facing society are so complex that multiple forms of expertise are needed. Consider security, science, law, health, policy-making, finance. The team is the ubiquitous organisational unit, but the quality of its reasoning, especially under pressure, can vary dramatically. Problems are not provided in neat, well-defined packages: a team must frame problems in creative ways that lead to insights, and resolve uncertainties around possible responses, making the best possible use of evidence, plus their own judgement. We will term this whole process Evidence-Based Reasoning (EBR). (We note of course that politics and social dynamics are unavoidable whenever people work together, and effective team members learn how to navigate these dynamics effectively.)

Improving collaborative EBR is an interesting scientific and design challenge. A successful support system (i.e. ways of working + enabling tools) must respect the principles of good reasoning, as determined by fields such as logic, argumentation and epistemology, and the domain-specific knowledge (e.g. emergency response, engineering, social work, counter-intelligence). At the same time, it must accommodate the strengths, weaknesses and vagaries of human reasoners, which is the terrain of cognitive and social psychologists. If part of the support system is interactive software, then it must have a good user interface and a solid underlying architecture. Assessing the resulting performance is a difficult evaluation problem. Building such systems is therefore inherently multidisciplinary.

How do we better equip teams for collaborative EBR? From an educational perspective, teamwork, problem solving and critical thinking skills are now among the most in-demand ‘transferable competencies’ (Buckingham Shum & Deakin Crick, 2016; Fiore et al, 2018). The challenge of assessing and equipping graduates in these is at the heart of the learning and teaching strategies at UTS and U. Melbourne.

The technology support challenge:

In some fields there are specialist tools for modelling and simulation that assist analysts by managing constraints in the problem; but even with machine intelligence, the agency typically rests with the human analysts to decide how much weight to give to the machine’s output. Most other fields, however, do not have such tools: collaborative EBR is typically supported by general-purpose information technologies such as word processors, spreadsheets, databases, and project planners to help with managing information and producing reports. Similarly, generic communication tools dominate, such as email, chat, video conferencing and phone. In most cases, the reasoning itself is left wholly to the human reasoners themselves.

There have been remarkably few attempts to provide direct technological support for the processes of inference and judgement that are at the heart of collaborative EBR, and moreover, those attempts have had little impact on the way it is actually conducted in most places (van Gelder, 2012). There are methods and software tools for facilitating group processes and visualising team reasoning, but these require quite an advanced facilitation and software skillset (e.g. Culmsee and Awati, 2013; Okada et al., 2008; Selvin et al., 2010).

Our interest is in developing computer support to improve the collaborative EBR of geographically and often temporally distributed teams, support that does not require specialist skills beyond what are now familiar collaboration tools. SWARM is an online platform emerging from an ongoing research project to improve the kind of collaborative EBR undertaken by intelligence analysts making sense of complex sets of qualitative and quantitative information of varying reliability. These, however, are the conditions under which most other domains operate, and we hypothesise that the platform has broader potential, and specifically in this project, for education and training. SWARM is based on three design principles: cultivating user engagement, exploiting natural expertise, and supporting rich collaboration (van Gelder et al, 2018). Central to its approach is the upskilling of team members to equip them with different EBR skills (see in particular the Lens Kit).

Figure 1: The SWARM workspace

Recent large scale empirical evaluations, in which teams of analysts tackled complex challenges with or without SWARM, indicated that the quality of the reports produced by SWARM teams was significantly better than reports produced by analysts using normal methods (van Gelder et al, In Prep). In a follow-up project, “super-teams” on the platform produced reasoning so good it would plausibly be called “super-reasoning” (van Gelder & de Rozario, 2017), analogous to “super-forecasting” (Tetlock & Gardner, 2015).

Learning Analytics for SWARM

The encouraging evidence of SWARM’s effectiveness makes it an attractive candidate platform for use in educational/training contexts. While evaluation of final reports (i.e. the team’s product) is a conventional measure of team performance, and certainly one that educators will be interested in, this is not the only possible indicator of improvement. The emergence of data science, activity-based analytics and visualisation opens new possibilities for tracking the process that teams are following. Learning Analytics connects such techniques to what is known about the teaching and learning of teamwork, and could make the assessment of team performance more rigorous, and more cost effective.

This PhD therefore focuses on inventing and validating new forms of automated team analytics for collaborative EBR, to provide insights into both process and product. Such analytics might enable not only coaches and researchers to gain insights into a team’s effectiveness, but also the teams themselves to monitor their work in real time, or critically review their project on completion. Further, real-time analytics can be used to shape the collaborative environment itself, resulting in better collaboration and better outputs. Some prototype analytics have already been developed to summarise participants’ contributions and interactions. This PhD will build on that work, synthesising the literature plus insights from the SWARM team and educators, in order to define, design, implement and evaluate automated analytics in different contexts, spanning education and training, research, and potentially more authentic deployments with professional teams.

Figure 2: Early version of the SWARM group dynamics dashboard. The upper diagram shows levels of interaction among team members working on a particular problem.

Relevant analytics techniques include, but are not limited to:

  • Text analysis to identify significant contributions to the team communications and the report they are producing
  • Social network analysis to identify significant interaction patterns among team members
  • Process mining to identify significant sequences in the actions that individuals engage in, within or between sessions
  • Statistical techniques to identify significant differences between teams
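
As a concrete illustration of the second technique (a self-contained toy sketch with invented data, not SWARM code), reply events in a team discussion can be treated as a directed graph, with centrality measures flagging dominant or peripheral members:

```python
import networkx as nx

# Invented interaction log: (author, replied_to) pairs from one team's thread.
replies = [
    ("ana", "ben"), ("ben", "ana"), ("ana", "carol"),
    ("carol", "ana"), ("dan", "ana"), ("ana", "dan"), ("ben", "carol"),
]

G = nx.DiGraph()
for src, dst in replies:
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += 1  # accumulate repeated interactions
    else:
        G.add_edge(src, dst, weight=1)

# Who receives the most replies, and who sits on paths between others?
in_centrality = nx.in_degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

for member in sorted(G.nodes):
    print(f"{member}: in-degree={in_centrality[member]:.2f}, "
          f"betweenness={betweenness[member]:.2f}")
```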

Candidates

In addition to the broad skills and dispositions that we are seeking in all candidates (see CIC’s PhD homepage), you should have:

  • A Masters degree, Honours distinction or equivalent with at least above-average grades in computer science, mathematics, statistics, or equivalent
  • Analytical, creative and innovative approach to solving problems
  • Strong interest in designing and conducting quantitative, qualitative or mixed-method studies
  • Strong programming skills in at least one relevant language (e.g. R, Python)
  • Experience with web log analysis, statistics and/or data science tools.

It is advantageous if you can evidence:

  • Design and Implementation of user-centred software, especially data/information visualisations
  • Skill in working with non-technical clients to involve them in the design and testing of software tools
  • Knowledge and experience of natural language processing/text analytics
  • Familiarity with the scholarship in a relevant area (e.g. high performance teams; collective intelligence; collaborative problem solving)
  • Peer-reviewed publications

Interested candidates should contact the team to open a conversation: Simon.BuckinghamShum@uts.edu.au; tgelder@unimelb.edu.au

We will discuss your ideas with you to help sharpen up your proposal, which will be competing with others for a scholarship. Please follow the application procedure for the submission of your proposal.

References

Buckingham Shum, S. & Deakin Crick, R. (2016). Learning Analytics for 21st Century Competencies. Journal of Learning Analytics, 3(2), pp. 6-21.

Culmsee, P. and Awati, K. (2013). The Heretic’s Guide to Best Practices: The Reality of Managing Complex Problems in Organisations. iUniverse.

Fiore, S. M., Graesser, A., & Greiff, S. (2018). Collaborative problem-solving education for the twenty-first-century workforce. Nature Human Behaviour, 2(6), 367–369.

Okada, A., Buckingham Shum, S. and Sherborne, T. (Eds.) (2008). Knowledge Cartography: Software Tools and Mapping Techniques. London, UK: Springer. (Second Edition 2014)

Selvin, A. M., Buckingham Shum, S. J. and Aakhus, M. (2010). The practice level in participatory design rationale: studying practitioner moves and choices. Human Technology: An Interdisciplinary Journal of Humans in ICT Environments, 6(1), pp. 71–105.

Tetlock, P., & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. London: Random House.

van Gelder, T.J. (2012). Cultivating Deliberation for Democracy. Journal of Public Deliberation. 8 (1), Article 12.

van Gelder, T., & de Rozario, R. (2017). Pursuing Fundamental Advances in Human Reasoning. In T. Everitt, B. Goertzel, & A. Potapov (Eds.), Artificial General Intelligence (Vol. 10414, pp. 259–262). Cham: Springer International Publishing.

van Gelder, T., de Rozario, R., & Sinnott, R. O. (2018). SWARM: Cultivating Evidence Based Reasoning. Computing in Science & Engineering, 20(6), 22–34. Downloadable from http://bit.ly/cultivatingEBR

van Gelder, T. J. et al (in preparation). Prospects for a Fundamental Advance in Analytical Reasoning.

Aligning Student-Facing Learning Analytics with Learning Designs

Supervisors

Kirsty Kitto, Roberto Martinez-Maldonado and Simon Buckingham Shum (UTS:CIC)

The Challenge

A growing number of educational technology products and research prototypes include student-facing dashboards, intended to provide students with actionable insights into their progress (Schwendimann, Rodriguez-Triana et al., 2017). The state of the art is, however, at an immature stage of development. Few dashboards are grounded in educational principles (Jivet et al., 2018), and they are rarely integrated with learning design and assessment to ensure that use of the dashboard fits coherently into the student activity (Kitto et al., 2017).

Lockyer, Heathcote, and Dawson (2013) contrast checkpoint analytics with process analytics. The former are rarely coupled with pedagogical approaches, and often consist of a one-step process: students engage in a class activity, and analytics are made available to inform them about their participation, but students are not required to engage with or respond to this feedback in any way. In contrast, process analytics are designed to provide specific insights about how a student has engaged in a task. To date, few studies have investigated how students actually make use of student-facing analytics (either checkpoint or process). One study, by Corrin and de Barba (2014), demonstrated that students often fail to understand simple reports, and another by Khan and Pardo (2016) showed that while students often approach checkpoint analytics with initial interest, this quickly tails off. In other words, the failure to link the analytics to learning design means that students often fail to see what the analytics mean for them.
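
The distinction is easy to see in code. In the toy sketch below (invented event data, not drawn from the cited studies), a checkpoint indicator merely records whether an event occurred, whereas a process indicator examines the sequence of actions within a task:

```python
import pandas as pd

# Invented clickstream for one learning task.
events = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s2", "s2"],
    "action":  ["read", "draft", "revise", "read", "read"],
    "ts": pd.to_datetime([
        "2019-03-01 09:00", "2019-03-01 09:20", "2019-03-01 10:05",
        "2019-03-01 09:00", "2019-03-01 09:02",
    ]),
})

# Checkpoint analytic: did the student produce a draft at all?
checkpoint = events.groupby("student")["action"].apply(
    lambda actions: "draft" in set(actions))

# Process analytic: the ordered sequence of actions, which can show
# whether revision followed drafting (engagement with the task design).
process = (events.sort_values("ts")
                 .groupby("student")["action"]
                 .apply(" -> ".join))

print(checkpoint)
print(process)
```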

We have been seeking to address these challenges in our recent work. Kitto et al. (2016, 2017) have demonstrated how to link student-facing analytics to learning design by requiring students to act on the dashboard feedback. Echeverria et al. (2018) explore how dashboards can be contextualised to specific activities, by enhancing process visualisations with Data Storytelling features that focus attention on the most important aspects of student activity, as defined by the teacher’s intended learning design. Martinez-Maldonado et al. (2017; 2018) consider the state of the art and challenges in orchestrating collocated collaboration analytics with learning designs.

In this PhD project you will be challenged to develop new ways in which student-facing analytics can help students navigate complex learning tasks, by focusing attention on learning-to-learn more effectively (Buckingham Shum & Deakin Crick, 2016):

  • uncovering information about how they behave in different learning tasks
  • reflecting on their weaknesses and strengths as they approach different problems
  • developing stronger metacognitive skills
  • reflecting on their practice in order to improve
  • improving their data literacy

In the future, could we imagine dashboards adapting to student activity, or changing each week to reflect each new assignment? What technical and usability challenges might this raise? We want to know in what directions you would take this project.

Analytics Approaches

A key requirement when selecting analytics approaches is that we seek to build higher-order skills for learning-to-learn. How will the feedback you design equip students in this way? All PhD research in CIC is in partnership with one or more academics in UTS who will deploy the analytics tool with their students. Depending on the collaborations that we forge, the kind of learning activity must match the kind of analytics feedback provided. Your expertise in specific techniques will obviously be a factor, e.g. any of those already in use in CIC (ensure you have browsed these), or new ones. Depending on how the project develops, there may be scope to integrate your analytics into existing student-facing platforms, or else to deploy a research prototype.

Methodologically, this project could cover a wide variety of approaches, according to your interests and prior skills. Candidates include protocols for lab-based experimental studies, field trials in classrooms, and qualitative analysis of transcripts. There is no doubt that you will engage in quantitative analysis of digital traces left by learners; candidate methods for this include statistical and machine learning methodologies. In your proposal, you should consider which methods you regard as relevant, and whether you already bring them or would require training.

Candidates

In addition to the skills and dispositions that we are seeking in all candidates (see guidelines), specifically for this project you should bring:

  • A Masters degree, Honours distinction or equivalent, with at least above-average grades, in education, the learning sciences, computer/data sciences
  • An analytical, creative and innovative approach to solving problems
  • An understanding of issues surrounding pedagogy, assessment, and data analysis
  • Strong interest in designing and conducting quantitative, qualitative or mixed-method studies

It is advantageous if you can evidence:

  • Software coding skills for dashboard implementation
  • Experience in the design of user-centred software and/or information visualisations
  • A strong interest, and ideally experience, in at least one learning analytics technique that you want to consider developing in this PhD
  • Peer-reviewed publications
  • A digital scholarship profile
  • Experience in designing and conducting quantitative, qualitative or mixed-method studies

All members of the supervision team are actively engaged in this topic, and well connected to the international research community, making this an ideal home for a PhD in this area. Interested candidates are strongly encouraged to contact Kirsty.Kitto@uts.edu.au and Roberto.Martinez-Maldonado@uts.edu.au to discuss ideas informally. We aim to give you feedback on your suitability and on draft proposals.

Please follow the application procedure for the formal submission of your proposal.

References

Bakharia, A., Kitto, K., Pardo, A., Gašević, D., Dawson, S. (2016), Recipe for success: lessons learnt from using xAPI within the connected learning analytics toolkit. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK16). ACM, New York, NY, USA, 378-382. [Open Access ePrint]

Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gašević, D., Mulder, R., … Lockyer, L. (2016). A conceptual framework linking learning design with learning analytics. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, 329-338.

Buckingham Shum, S. and Deakin Crick, R. (Eds.) (2016). Learning Analytics for 21st Century Competencies. Special Issue: Journal of Learning Analytics, 3(2), 6-212. http://learning-analytics.info/journals/index.php/JLA/issue/view/381

Corrin, L., & de Barba, P. (2014). Exploring students’ interpretation of feedback delivered through learning analytics dashboards. Proceedings of the ASCILITE 2014 Conference, Otago.

Echeverria, V., Martinez-Maldonado, R.,  Granda, R., Chiluiza, K., Conati, C., and Buckingham Shum, S. (2018) Driving Data Storytelling from Learning Design, In Proceedings of the Eighth International Learning Analytics & Knowledge Conference (LAK ’18). ACM, New York, NY, USA. https://doi.org/10.1145/3170358.3170380 [Open Access ePrint]

Gibson, A., Kitto, K., Bruza, P. (2016). Towards the Discovery of Learner Metacognition From Reflective Writing. Journal of Learning Analytics, 3(2), 22-36. http://dx.doi.org/10.18608/jla.2016.32.3

Jivet, I., Scheffel, M., Specht, M. and Drachsler, H. (2018). License to evaluate: Preparing learning analytics dashboards for educational practice. In Proceedings of International Conference on Learning Analytics and Knowledge, Sydney, NSW, Australia, March 7–9, 2018 (LAK ’18), 10 pages. https://doi.org/10.1145/3170358.3170421

Khan, I., & Pardo, A. (2016). Data2U: Scalable real time student feedback in active learning environments. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, 249-253.

Kitto, K., Buckingham Shum, S., Gibson, A., (2018). Embracing Imperfection in Learning Analytics, In Proceedings of the Eighth International Learning Analytics & Knowledge Conference (LAK ’18). ACM, New York, NY, USA, In press (Accepted 21/11/2017). [Open Access ePrint]

Kitto, K., Lupton, M., Davis, K., Waters, Z. (2017). Designing for Student Facing Learning Analytics, Australasian Journal of Educational Technology, 33(5), 152-168. [Open Access ePrint]

Kitto, K., Lupton, M., Davis, K., Waters, Z. (2016). Incorporating student-facing learning analytics into pedagogical practice. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show Me The Learning. Proceedings ASCILITE 2016 Adelaide, pp. 338-347. [Open Access ePrint]

Kitto, K., Cross, S., Waters, Z., Lupton, M. (2015). Learning Analytics beyond the LMS: the Connected Learning Analytics Toolkit. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (LAK15). ACM, New York, NY, USA, 11-15. [Open Access ePrint]

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10) 1439–1459. [Open Access ePrint]

Martinez-Maldonado, R., Buckingham Shum, S., Schneider, B., Charleer, S., Klerkx, J., and Duval, E. (2017). Learning Analytics for Natural User Interfaces. Journal of Learning Analytics, 4(1), (March 2017): 24-57.

Martinez-Maldonado, R., Kay, J., Buckingham Shum, S., and Yacef, K. (2018, In Press). Collocated Collaboration Analytics: Principles and Dilemmas for Mining Multimodal Interaction Data. Human-Computer Interaction.

Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., Gillet, D., & Dillenbourg, P. (2017). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30-41.

Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, Indianapolis, IN, 203-211. https://doi.org/10.1145/2567574.2567588

Writing Analytics for Deep Reflection

Supervisors

Ming Liu and Simon Buckingham Shum (UTS:CIC), Cherie Lucas (UTS:Pharmacy)

The supervision team for this PhD is a partnership between CIC and the School of Pharmacy, who together have pioneered reflective writing analytics.

The Challenge

The societal challenge:

“We do not learn from experience… we learn from reflecting on experience.”
John Dewey

The problems now confronting society place an unprecedented urgency on learning from experience. Such is the pace of change that before we can plan for them, citizens and professionals in all sectors find themselves immersed in novel, complex problems. Moreover in education, the learning sciences tell us that crafting authentic experiences is a powerful trigger for learning.

“White water is the new normal”, as they say. But as Dewey noted, critical to this is our capacity to reflect on the experience of shooting those rapids. If we can’t learn how we could do better next time — individually and collectively — we are in deep trouble. From school age students, through higher education, and into professional leadership, we have to make sense of challenging experiences, recognise how we were challenged, how we are changing, and how we can improve.

At the heart of deep learning is our sense of identity. People rarely shift from entrenched positions by force of argument alone. However, when we undergo deeply challenging experiences that force us to question assumptions and worldviews at the heart of our identity, this can indeed be transformational, if we are assisted in making sense of the experience and can emerge with our identity intact but now under reconstruction. Without such shifts, it’s hard to see how we will move beyond current polarisations around how we relate to each other, and the planet. Given our current political and cultural climate, applied research to help people reflect on how they adjust to threatening transitions is both timely, and of first order importance.

So, we need to get better at deep reflection, and clearly, there’s nothing as valuable as detailed feedback to provoke further reflection. But providing such feedback is a scarce skillset and very labour-intensive. The practical consequence is that most students and leaders do not understand what good reflective writing is, and do not receive good feedback. For these reasons, there’s interest in educational and professional sectors in the potential of automated techniques to deliver real-time, personalised coaching.

In sum, this PhD is fundamentally about harnessing computational intelligence to deepen human learning in contexts spanning formal education, professional practice, and community transformation.

The writing challenge:

Effective written communication is an essential skill which promotes educational success for university students. However, far too many students have never had the features of good rhetorical moves explained well to them, and most educators are subject matter experts, not skilled writing coaches (Lucas, Gibson & Buckingham Shum, 2018). CIC initiated its Academic Writing Analytics (AWA) project in 2015, as it became clear through consultations across faculties that student writing was a strategically important area for UTS teaching and learning (and indeed, for most other educational institutions). The goal is to more effectively teach the building blocks of good academic writing by providing instant, personalised, actionable feedback to students about their drafts (Knight, Buckingham Shum, Ryan, Sándor, & Wang, 2018).

To deliver on this vision requires integrated expertise including natural language processing, linguistics, academic language pedagogy, learning design, feedback design, user experience, and cloud computing. This is truly a transdisciplinary effort, which has been enormously stimulating. To date, we have worked on critical, argumentative, analytical writing of the sort typically found in literature reviews, persuasive essays and research articles, as well as reflective writing, in which learners make sense of their workplace experiences, try to integrate this with their academic understanding, and share their own uncertainties, emotions and sense of personal challenge/growth (Gibson, Aitken, Sándor, Buckingham Shum, Tsingos-Lucas, & Knight, 2017).

Learning Analytics tools are most effective when co-designed with effective Learning Designs: the features constructed by the analytics align with the assessment criteria, and the tool is coherently embedded in authentic student learning tasks. Our program has demonstrated how this can be accomplished (Knight, Shibani & Buckingham Shum, 2018; Shibani, Knight, Buckingham Shum & Ryan, 2017).

Depending on your interests and skillset, critical advances that this PhD might make include:

    • Integration between rule-based modelling and machine learning approaches
    • Accelerated customisation of the parsers to different disciplinary domains and genres of writing
    • Design and validation of analytics-augmented learning design patterns
    • Definition of new computational proxies that can serve as indicators of deep reflection
    • User experience and machine learning to enable user feedback that teaches the tool when it makes errors
    • Curation of text corpora to advance the field
    • Radical improvements in the user experience through co-design techniques and user interface technologies

Analytics Approaches

We currently implement the underlying concept matching model using a rule-based grammar and human-curated lexicons, which, for those not familiar with this kind of work, brings both pros and cons (Buckingham Shum, Sándor, Goldsmith, & McWilliams, 2017; Ullmann, 2017). The rules are grounded in scholarly literature on the features of academic research writing, and have been tested on diverse texts by the team through close manual analysis. The lexicons can be edited to tune them to the language used in different disciplines and subjects. This relatively traditional AI approach provides familiar intellectual credentials when introducing the system to educators, and when we’re testing it, the underlying behaviour is easier to explain, and errors can be diagnosed very precisely. However, it brings the limitations associated with any rule-based approach: given the richness of open-ended reflective writing, there are exception cases to debug, and improvements to the system’s performance require manual edits to the rules and lexicon.
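
For readers unfamiliar with this style of system, the toy sketch below conveys the rule-plus-lexicon idea (AWA’s actual grammar and lexicons are far richer; the cues and the rule here are invented for illustration):

```python
import re

# Hand-curated, easily editable lexicons for two reflective "moves"
# (all cues invented for this sketch).
LEXICON = {
    "challenge": {"struggled", "difficult", "confused", "unsure"},
    "change":    {"realised", "learned", "now understand", "will try"},
}

FIRST_PERSON = re.compile(r"\b(i|my|me|we|our)\b", re.IGNORECASE)

def tag_sentence(sentence):
    """Rule: a reflective move needs a first-person marker plus a lexicon cue."""
    tags = set()
    lowered = sentence.lower()
    if FIRST_PERSON.search(sentence):
        for move, cues in LEXICON.items():
            if any(cue in lowered for cue in cues):
                tags.add(move)
    return tags

text = ("I struggled to explain the dosage to the patient. "
        "I now understand why clear communication matters.")
for sent in re.split(r"(?<=[.!?])\s+", text):
    print(tag_sentence(sent) or "-", "|", sent)
```

Because the lexicons are plain data, tuning the system to a new discipline is a matter of editing word lists rather than retraining a model, which is exactly the pro (transparency) and con (manual effort) described above.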

We are now beginning work to investigate whether a machine learning approach can augment the current infrastructure (Liu et al., 2019; Ullmann, 2019). In recent years, with the availability of “big data” such as large question-answer banks (Rajpurkar, Zhang, Lopyrev, & Liang, 2016), and effective machine learning algorithms, e.g. deep neural networks (Lecun, Bengio, & Hinton, 2015), data-driven approaches based on new data processing architectures have attracted great attention in natural language processing tasks such as neural text summarization (See, Liu, & Manning, 2017) and neural machine translation (Bahdanau, Cho, & Bengio, 2014), mainly because these approaches do not require human-defined rules and have good generalization power. However, such data-driven approaches require a large amount of data, and some statistical learning models, such as deep neural networks, are not easy to comprehend. Preliminary results are reported in Liu et al. (2019).
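
By contrast, a minimal supervised machine learning baseline might look like the sketch below (illustrative only: the sentences and labels are invented, and the deep neural approaches cited above would replace the simple linear model):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training data: sentences labelled reflective (1) or not (0).
sentences = [
    "I was unsure how to respond when the patient became upset.",
    "The clinic opens at nine and closes at five.",
    "Looking back, I realise I should have asked for help sooner.",
    "Paracetamol is a common analgesic.",
]
labels = [1, 0, 1, 0]

# Bag-of-ngrams features feeding a linear classifier: a deliberately
# simple baseline against which richer neural models can be compared.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(sentences, labels)

print(model.predict(["Next time I will double-check the prescription."]))
```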

Therefore, we invite your proposals as to which techniques might be best suited to this challenge. What’s more, the creation of a writing corpus raises ethical challenges, and we invite your thoughts on what these are, and how we might address them.

You will work in close collaboration with one or more academics from other faculties/units in UTS, using co-design methods with academics and potentially external partners, with opportunities for synergy with existing projects and tools as described on the CIC website. For more information about ongoing research in this area, please visit the Academic Writing Analytics homepage and the Writing Analytics blog.

Resources to help you understand the current state of the technology and its educational applications include the references cited, plus:

Open source release of writing analytics infrastructure

Analytical Writing (Standard)

Improve sample text plus peer discussion (Civil Law)

Improving research abstracts/ intros

Improving pharmacy students’ reflective writing

Candidates

In addition to the broad skills and dispositions that we are seeking in all candidates (see CIC’s PhD homepage), you should have:

    • A Masters degree, Honours distinction or equivalent with at least above-average grades in computer science, mathematics, statistics, or equivalent
    • Analytical, creative and innovative approach to solving problems
    • Strong interest in designing and conducting quantitative, qualitative or mixed-method studies
    • Knowledge and experience of natural language processing/text analytics
    • Strong programming skills in at least one relevant language (e.g. C++, .NET, Java, Python)
    • Experience with statistics, data mining, deep learning, or data science tools (e.g. R, Weka, TensorFlow, ProM, RapidMiner).

It is advantageous if you can evidence:

    • Skill in working with non-technical clients to involve them in the design and testing of software tools
    • Familiarity with the scholarship of writing
    • Familiarity with educational theory, instructional design, learning sciences
    • Peer-reviewed publications
    • Design and Implementation of user-centred software

Interested candidates should contact the team to open a conversation: Ming.Liu@uts.edu.au; Simon.BuckinghamShum@uts.edu.au; Cherie.Lucas@uts.edu.au. We will discuss your ideas to help you sharpen up your proposal, which will be competing with others for a scholarship. Please follow the application procedure for the submission of your proposal.

References

Bahdanau, D., Cho, K., and Bengio, Y. (2014). Neural Machine Translation by Jointly Learning to Align and Translate. In Proceedings of the 3rd International Conference on Learning Representations

Buckingham Shum, S., Sándor, Á., Goldsmith, R., Bass, R., and McWilliams, M. (2017). Towards Reflective Writing Analytics: Rationale, Methodology and Preliminary Results. Journal of Learning Analytics, 4(1), 58–84.

Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., and Knight, S. (2017). Reflective Writing Analytics for Actionable Feedback. In Proceedings of LAK17: 7th International Conference on Learning Analytics & Knowledge.

Knight, S., Shibani, A. and Buckingham Shum, S. (2018). Augmenting Formative Writing Assessment with Learning Analytics: A Design Abstraction Approach. London Festival of Learning (ICLS/AIED/L@S Tri-Conference Crossover Track), London (June 2018).

Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., and Wang, X. (2018). Designing Academic Writing Analytics for Civil Law Student Self-Assessment. International Journal of Artificial Intelligence in Education, 28(1), 1-28.

Liu, M., Buckingham Shum, S., Mantzourani, E. and Lucas, C. (2019). Evaluating Machine Learning Approaches to Classify Pharmacy Students’ Reflective Statements. Proceedings AIED2019: 20th International Conference on Artificial Intelligence in Education, June 25th – 29th 2019, Chicago, USA. Lecture Notes in Computer Science & Artificial Intelligence: Springer.

Lucas, C., Gibson, A. and Buckingham Shum, S. (2018). Utilization of a novel online reflective learning tool for immediate formative feedback to assist pharmacy students’ reflective writing skills. American Journal of Pharmaceutical Education

Lecun, Y., Bengio, Y., and Hinton, G. (2015). Deep Learning. Nature, 521(7553): 436–444.

Rajpurkar, P., Zhang, J., Lopyrev, K., and Liang, P. (2016). SQuAD: 100,000+ Questions for Machine Comprehension of Text. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing.

See, A., Liu, P. J., and Manning, C. D. (2017). Get To The Point: Summarization with Pointer-Generator Networks. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.

Shibani, A., Knight, S., Buckingham Shum, S. and Ryan, P. (2017). Design and Implementation of a Pedagogic Intervention Using Writing Analytics. In Proceedings of the 25th International Conference on Computers in Education. New Zealand: Asia-Pacific Society for Computers in Education.

Ullmann, T.D. (2019). Automated Analysis of Reflection in Writing: Validating Machine Learning Approaches. Int. J. Artif. Intell. Educ. 1–41.
