LASI Dispositional Learning Analytics Workshop

2-5pm, July 2, 2013, Stanford University
learningemergence.net/events/lasi-dla-wkshp


GOALS

This LASI workshop does not assume any prior knowledge of learning dispositions, although background papers and resources are provided in advance for those who wish. The goal is to forge new connections between people already working in the field, and to spark conversations for the rest of LASI and beyond which, we hope, will grow into new initiatives.

Goals:

  • introduce research on how students’ and educators’ dispositions to learning can shape outcomes
  • describe software tools grounded in that research, which enable the techniques to be deployed at scale
  • review the impact of such tools on learners and educators
  • show how, as a by-product of web delivery, one can build quality datasets
  • describe how this data is amenable not only to traditional educational analysis, but also to machine learning and big data approaches
  • consider user interfaces which enable different stakeholders (eg. learners; educators; researchers) to interact with that data coherently
  • look to the future and brainstorm ideas for the tough questions this field faces, e.g. can we develop dispositional analytics based on learners’ activity traces (rather than self-report)? Can we move from analytics to recommendation engines able to make timely interventions for educators, or to offer guidance to learners? (A minimal sketch of the first question follows this list.)
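
For concreteness, here is what a first pass at the activity-traces question might look like. It is a minimal sketch under invented assumptions (Python; the CSV file and every column name are hypothetical, and it is not code from any tool presented at the workshop): it simply asks whether trace features can predict one self-reported disposition score under cross-validation.

```python
# A minimal sketch, with invented data: can per-learner activity-trace features
# predict a self-reported disposition score at all? Column names and the CSV
# file are hypothetical; this is not code from any workshop tool.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

traces = pd.read_csv("activity_traces.csv")   # hypothetical per-learner export
X = traces[["forum_posts", "revisits", "help_requests", "time_on_task"]]
y = traces["self_reported_resilience"]        # e.g. one ELLI-style dimension

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Mean cross-validated R^2: {scores.mean():.2f}")
# A consistently positive R^2 would suggest traces carry some dispositional
# signal; it would not by itself justify replacing the self-report instrument.
```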

SPEAKERS

Organiser: Simon Buckingham Shum – Professor of Learning Informatics, Knowledge Media Institute, The Open University, Milton Keynes, UK

Ruth Deakin Crick – Reader in Systems Learning & Leadership, Centre for Systems Learning and Leadership, Graduate School of Education, University of Bristol, UK

Chris Goldspink – Director and Chief Scientific Officer, Incept Labs, Sydney, AUS

Nelson González – Co-Founder & Chief Strategy Officer, PiersonLabs, San Francisco, USA

Dave Paunesku – Founder, PERTS Lab, & Dept. Psychology, Stanford University, USA

PROGRAMME OVERVIEW

2pm: Welcome – Simon Buckingham Shum

2.15pm: Dispositional Learning Analytics: Learning Power & Complex Systems – Ruth Deakin Crick and Simon Buckingham Shum

2.35pm: Open discussion

2.45pm: Measuring & Changing Student Psychology Online: Lessons from a Scale-up Project – Dave Paunesku

3.05pm: Open discussion

3.15pm: Layers, Loops and Processes: Multi-level Analytics in Learning Systems – Ruth Deakin Crick, Chris Goldspink & Shaofu Huang

3.35pm: Open discussion

3.45pm: Learner Dispositions: Big Data Meets Focused Social Science Research – Nelson González, Chris Goldspink and Ruth Deakin Crick

4.05pm: Open discussion

4.30pm: Close

SESSION DETAILS & RESOURCES

We grabbed a camcorder and iPad to capture the speakers in action (thanks Rebecca Ferguson!), so while it’s a bit lo-fi, we hope that viewed alongside the slides this gives you a good feel for the event… enjoy!

2pm: Welcome — Simon Buckingham Shum

In this talk I refer to a recent talk by John Seely Brown at Reimagining Education, in which he discusses the importance of learning dispositions. Here’s the key clip:

[Video clip: John Seely Brown on the importance of learning dispositions]

2.15pm: Dispositional Learning Analytics: Learning Power & Complex Systems

Ruth Deakin Crick and Simon Buckingham Shum

Abstract: Theoretical and empirical evidence in the learning sciences substantiates the view that deep engagement in learning is a complex and dynamic process which involves a combination of learners’ identities, dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments and, critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. Furthermore, learning communities are themselves complex systems consisting of multiple interacting layers or sub-systems. We have identified three key processes in schools as learning communities, which constitute sub-systems of learning activities and ‘viewpoints’ from which to understand the system as a whole. These are (i) leadership, including both community and school; (ii) teacher professional learning; and (iii) student engagement in learning and achievement. Each involves a mix of human, social and technical factors that shape the activities that occur and the meaning that individuals attribute to these events. In a simple sense, almost everything interacts with everything else, and the organisation and relational dynamics of a school, including parents and community, interact with work inside its classrooms to advance or inhibit deep student learning. These create feedback loops – learning loops – which require rapid sensemaking, knowledge structuring and wisdom in use.

To date, the learning analytics research and development communities have not addressed how these complex concepts can be modelled and analysed, and how more traditional social science data analysis can support and be enhanced by learning analytics. We report progress in the design and implementation of three learning analytics tools. The first is based on a research-validated multidimensional construct termed “learning power”, which informs learning at each of these levels through a visual analytic that has been shown to have a demonstrable impact on learners. We then refer to a rapid prototyping model of teacher enquiry, which informs teacher learning and benefits from the rapid feedback afforded by the learning power analytic. Finally, we describe Perimeta, a hierarchical process modelling tool borrowed from engineering, which provides a learning analytic for leaders and the whole system. We also describe the Learning Warehouse, a learning analytics infrastructure for gathering data at scale and managing stakeholder permissions, the range of analytics it supports from real-time summaries to exploratory research, and the limitations and possibilities we have experienced in its use. We conclude by summarising the ongoing research and development programme and identifying some challenges of integrating traditional social science research with learning analytics and modelling.
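
For readers who have not seen the “learning power” visual analytic referred to above, the sketch below shows the general shape of the idea: a learner’s multi-dimensional profile rendered as a spider diagram. The dimension names follow published descriptions of ELLI, but the simulated item responses and the naive item-mean scoring are placeholders, not the validated instrument or its scoring.

```python
# A sketch of a "learning power" profile drawn as a spider/radar diagram.
# Dimension names follow published descriptions of ELLI; the item data and the
# simple item-mean scoring are placeholders, not the validated scoring.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Changing & Learning", "Critical Curiosity", "Meaning Making",
              "Creativity", "Learning Relationships", "Strategic Awareness",
              "Resilience"]

# Hypothetical Likert-style item responses (1-5), eight items per dimension.
rng = np.random.default_rng(0)
item_responses = {d: rng.integers(1, 6, size=8) for d in dimensions}
scores = [item_responses[d].mean() for d in dimensions]

angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
ax = plt.subplot(polar=True)
ax.plot(angles + angles[:1], scores + scores[:1])
ax.fill(angles + angles[:1], scores + scores[:1], alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(dimensions, fontsize=8)
ax.set_ylim(0, 5)
ax.set_title("Illustrative learning power profile")
plt.show()
```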

Background resources:

Simon Buckingham Shum and Ruth Deakin Crick (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics. Proc. 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr-2 May, 2012). ACM. pp.92-101.
Open Access Eprint: http://oro.open.ac.uk/32823 / Slides/Replay

Da Ros-Voseles, D., & Fowler-Haughey, S. (2007). Why Children’s Dispositions Should Matter to All Teachers. Beyond the Journal: Young Children on the Web.

Deakin Crick, R. (2012). Student Engagement: Identity, Learning Power and Enquiry – A Complex Systems Approach. In S. Christenson, A. L. Reschly & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 675-694). New York: Springer.

Deakin Crick, R., & Yu, G. (2008). Assessing Learning Dispositions: Is the Effective Lifelong Learning Inventory Valid and Reliable as a Measurement Tool? Educational Research, 50(4), 387-402.

Dweck, C. S. (2000). Self-Theories: Their Role in Motivation, Personality, and Development. New York: Psychology Press.

Dweck’s Mindset resource website: http://mindsetonline.com

Learning Power, ViTaL Partnerships website: http://www.vitalpartnerships.com/learning-power

Shaofu Huang (2013). Prototyping Learning Power Modelling in SocialLearn. Social Learning Analytics Symposium, The Open University, UK (June 20, 2013). Webcast/slides

2.35pm: Open discussion

2.45pm: Measuring & Changing Student Psychology Online: Lessons from a Scale-up Project

Dave Paunesku

Ironically, Carol Dweck was unable to join us on her home turf of Stanford since she was in London giving this very helpful introduction to her research into the impact of growth mindsets on learning. However, we were delighted that one of her team, Dave Paunesku, whose PERTS Lab is exploring the use of web apps to scale up her research, was able to join us.

Abstract: Research shows that students’ mindsets—the ways students think about school and their own abilities—play a key role in their academic motivation and achievement (see Yeager & Walton, 2011). Resilient students understand why school is important; they trust their teachers and peers; and they understand that they can grow their abilities by working hard and by trying new strategies when they get stuck. Furthermore, the results of multiple, carefully controlled field experiments demonstrate that these mindsets can be changed and that students’ motivation and achievement improve when they are. The Project for Education Research That Scales (PERTS) has developed an online research platform to measure students’ mindsets and to deliver and test interventions that influence students’ mindsets using low-cost randomized controlled trials. It has also partnered with the Khan Academy to inject minimal versions of these interventions directly into online educational exercises. The results of this research confirm on a large scale that students’ mindsets play a robust role in academic achievement and that even brief interventions can affect these mindsets. The vast amounts of data collected through this large-scale approach provide unprecedented opportunities to learn about the specific contexts in which students’ mindsets matter most; they also reveal the need for new tools that will enable researchers and practitioners to more effectively interpret and act on these types of data.
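
To make the “low-cost randomized controlled trial” idea concrete, here is a minimal sketch of the core logic: randomize students to a condition when they reach the platform, then compare an outcome across conditions once results arrive. The data, effect size and variable names are all invented; this is not PERTS code.

```python
# A minimal sketch of the randomized-assignment logic behind an online mindset
# study. Data, effect size and names are invented; this is not PERTS code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_students = 2000

# Random assignment at the point a student reaches the platform.
condition = rng.choice(["control", "growth_mindset"], size=n_students)

# Simulated end-of-term outcome (e.g. GPA) with a small hypothetical effect.
outcome = rng.normal(loc=2.8, scale=0.6, size=n_students)
outcome[condition == "growth_mindset"] += 0.08

t, p = stats.ttest_ind(outcome[condition == "growth_mindset"],
                       outcome[condition == "control"])
print(f"t = {t:.2f}, p = {p:.4f}")
# At this scale even modest effects become detectable, which is the point of
# testing interventions online rather than classroom by classroom.
```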

Background resources:

David S. Yeager and Gregory M. Walton (2011). Social-Psychological Interventions in Education: They’re Not Magic. Review of Educational Research, vol. 81, no. 2, 267-301. DOI: 10.3102/0034654311405999. http://rer.sagepub.com/content/81/2/267. Eprint: PDF

PERTS Lab: https://www.perts.net

3.05pm: Open discussion

3.15pm: Layers, Loops and Processes: Multi-level Analytics in Learning Systems

Ruth Deakin Crick, Chris Goldspink & Shaofu Huang

Abstract: Schools are complex systems, with several layers, feedback loops and processes. Understanding the purpose of a school allows us to identify its system boundaries and the core processes which enable it to achieve its purpose. In the 21st century, schooling purposes necessarily include the generation of student dispositions for, and competence in, self-directed learning, citizenship, eco-sustainability and employability, as well as traditional knowledge and competence in particular domains. Learning analytics enable us to enhance and render more visible the feedback loops in the core processes of student learning, teacher learning and organisational learning through (i) rapid feedback, (ii) visualisation of complex patterns in data and (iii) integration of different types of data. In this paper we present some of our most recent research, which builds on a foundation of learning dispositions research over the last decade.

Four current projects, in Australia and the UK, have produced evidence about patterns and relationships in the contextual variables which influence student learning and achievement. Together, these contribute to the construction of pedagogical models which enable the generation of learning designs and analytics grounded in substantive social science evidence about deep learning for individuals and organisations. In particular they make use of structural equation modelling, causal loop modelling and hierarchical process modelling, which enable us to make judgements about the contextual variables requiring pedagogical attention in the generation of teaching as learning design. The real-world challenge of educational improvement is relentlessly trans-disciplinary, involving a complex interplay between social, historical, institutional, technical and personal factors. This presents a challenge to both theory and practice. Traditional data, for example PISA, achieves comparability through the use of widely available proxy indicators, but lacks the depth and resolution needed to provide an understanding of the mechanisms driving the patterns it surfaces. Learning analytics promises new avenues through which to meet this challenge. This session focuses on four models: student engagement, the internal structure of learning power, the dynamics of authentic pedagogy and a hierarchical process model for evaluating the wider outcomes of schooling. It demonstrates the possibilities of learning analytics for stimulating self-directed change, based on these models.
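
As a very small illustration of the style of modelling referred to above, the sketch below estimates a single mediated path (contextual variable to engagement to attainment) with two ordinary regressions. The data file and variable names are hypothetical, and the projects’ actual structural equation and hierarchical process models are considerably richer than this.

```python
# A toy path analysis in the spirit of the modelling mentioned above: does a
# contextual variable appear to influence attainment partly via engagement?
# The data file and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("school_survey.csv")   # hypothetical: one row per student

# Path a: contextual variable (e.g. perceived teacher support) -> engagement
path_a = smf.ols("engagement ~ teacher_support", data=df).fit()

# Paths b and c': engagement -> attainment, with the direct path controlled
path_bc = smf.ols("attainment ~ engagement + teacher_support", data=df).fit()

indirect = path_a.params["teacher_support"] * path_bc.params["engagement"]
print(path_a.summary())
print(path_bc.summary())
print(f"Estimated indirect effect (a*b): {indirect:.3f}")
# Full structural equation modelling (latent constructs, fit indices) would use
# a dedicated SEM package rather than two separate regressions.
```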

Background resources:

Ruth Deakin Crick, Chris Goldspink & Margot Foster (2013). Telling Identities: Learning as Script or Design? Learning Emergence Discussion Paper (June, 2013). PDF

Ruth Deakin Crick, Steven Barr & Howard Green (2013). Evaluating the Wider Outcomes of Schools: Complex Systems Modelling. Learning Emergence Discussion Paper (June, 2013). PDF

Shaofu Huang (2013). Modelling Learning Dynamics in an Authentic Pedagogy Setting. Presented at the Systems Learning and Leadership Seminar, Graduate School of Education, University of Bristol, UK (May 24, 2013). Prezi Slides

3.35pm: Open discussion

3.45pm: Learner Dispositions: Big Data Meets Focused Social Science Research

Nelson González, Chris Goldspink and Ruth Deakin Crick

Abstract: As an example of how the above work can converge, this final session brings together educational policy and decision makers, researchers and analytics designers to consider the scenario of combining dispositional data (reflecting both Deakin Crick’s ELLI and Dweck’s Growth Mindset approaches introduced above, for both learners and educators) with demographic, health and welfare data, other classroom process data from research, and conventional school attainment data. We will reflect on how a large dataset of this kind has been built in collaboration with the South Australian Department for Education and Child Development, and how we are starting to analyse it using data mining techniques and information visualization.

There is growing international interest in the effect of learner dispositions on learning achievement – both academic and lifelong (Buckingham Shum & Deakin Crick, 2012; Deakin Crick, 2012; Deakin Crick & Yu, 2008; Dweck, 2000; Goldspink & Foster, Forthcoming). However, dispositions to learning have been described, theorized and measured in a number of different ways. Broadly, dispositions are defined as a tendency to behave in a certain way or, in the words of Da Ros-Voseles and Fowler-Haughey, as ‘frequent and voluntary habits of thinking and doing’ (Da Ros-Voseles & Fowler-Haughey, 2007). Such open definitions do not take us far towards an understanding of mechanisms, and potentially span everything from innate characteristics through to socially mediated and individually constructed senses of, and beliefs about, the self as a learner. Indeed, indications of the importance of learner dispositions are coming from psychology and sociology, as well as neuroscience. Each of these disciplines theorizes the concept differently and implicates different mechanisms in the development, entrenchment and change of dispositions. Recent research in South Australia has collected data using several different measures of dispositions linked to a sub-set of these conceptualisations. These include the belief-based view of Dweck; the approach of Deakin Crick, which argues that dispositions are at least partly the product of a social process that is formative (and potentially transformative) of learner identity; and a measure developed by Goldspink, which suggests that dispositions may also be associated with tolerance of ambiguity or uncertainty. These data, available for individual students (N=4000), can also be linked to a dataset of student demographic and social data recently assembled by the South Australian Department for Education and Child Development. This dataset, called the ‘education and child development data bank’, links data from different social agencies concerned with child welfare, including health and welfare data and indicators of early childhood experience and social and emotional wellbeing.

For this session we have extracted a large set of de-identified data which includes the dispositions measures as well as a selected sub-set of the big data, for analysis using tools developed by PiersonLabs. The purpose is to trial these tools as a means of finding deep structures and relationships which predict different aspects of dispositions measured in different ways. This should then support the development of a refined and focused research agenda to improve our understanding of, and capacity to theorize, the concept, and to better relate it to other research bearing on how learners orientate and position themselves towards the new and the novel.
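
As a deliberately simplified picture of what trialling such tools can involve at the data-handling level, the sketch below joins a de-identified dispositions table to a linked demographic extract and screens the contextual indicators for association with one disposition measure. Every file, column and identifier name is invented; this is not the actual data bank schema or the PiersonLabs tooling.

```python
# A sketch of the linkage-and-screening scenario described above. All file,
# column and ID names are invented; numeric indicators are assumed.
import pandas as pd
from sklearn.feature_selection import mutual_info_regression

dispositions = pd.read_csv("dispositions_deidentified.csv")  # anon_id + disposition scores
context = pd.read_csv("ecd_extract_deidentified.csv")        # anon_id + demographic/welfare indicators

df = dispositions.merge(context, on="anon_id", how="inner")
df = df.dropna(subset=["mindset"])            # one disposition measure at a time

predictors = [c for c in context.columns if c != "anon_id"]
X = df[predictors].fillna(df[predictors].median())
y = df["mindset"]

mi = mutual_info_regression(X, y, random_state=0)
ranked = pd.Series(mi, index=predictors).sort_values(ascending=False)
print(ranked.head(10))
# Screening like this only surfaces candidate relationships for the focused,
# theory-driven follow-up described above; it proves nothing by itself.
```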

Background resources:

Declara: http://declara.com / PiersonLabs: http://www.piersonlabs.com

Da Ros-Voseles, D., & Fowler-Haughey, S. (2007). Why Children’s Dispositions Should Matter to All Teachers. Beyond the Journal: Young Children on the Web. [PDF]

Deakin Crick, R. (2012). Student Engagement: Identity, Learning Power and Enquiry – A Complex Systems Approach. In S. Christenson, A. L. Reschly & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 675-694). New York: Springer.

Deakin Crick, R., & Yu, G. (2008). Assessing Learning Dispositions: Is the Effective Lifelong Learning Inventory Valid and Reliable as a Measurement Tool? Educational Research, 50(4), 387-402.

Dweck, C. S. (2000). Self-Theories: Their Role in Motivation, Personality, and Development. New York: Psychology Press.

Goldspink, C., & Foster, M. (Forthcoming). A Conceptual Model and Set of Instruments for Measuring Student Engagement in Learning. Cambridge Journal of Education.

4.05pm: Open discussion

4.30pm: Close
