Learning to Learn – International Perspectives

Learning to learn is crucial for success in our complex, unpredictable and data-drenched world.

This new book from members of the Learning Emergence Network explores learning to learn through theory and practice from around the world. See our people pages for many of the authors.

Learning to learn is both a process and an outcome of formal education, along with other trans-disciplinary and life-wide competences. It goes deep into pedagogy and practice and is influenced by culture and context. As an outcome, it is a competence we aspire to measure and celebrate.

Learning how to learn is a crucial competence for human flourishing in 21st-century conditions of risk and uncertainty. It is one of eight key competences identified by the European Union within the Lisbon and Europe 2020 strategies (European Council 2006). The European Union maintains a keen interest in the topic, as demonstrated by its network of policy makers and several working groups on key competences, including the creation of the European Network on Learning to Learn (Hoskins & Fredriksson, 2008). Internationally, learning to learn is emerging as a focus for school improvement and as a foundation for lifelong and lifewide learning. UNESCO includes approaches to learning as a key domain which should be an entitlement for all children, and one which needs to be assessed.

Language matters.

There is a real need for serious debate about the term ‘learning to learn’, which is frequently used in different ways and in different contexts without clear definition. Often it is used within a conceptually narrow framework, limited to “measurable” study strategies and learning styles (OECD 2009), for which there is little evidence of success. There is an urgent need for a research-validated account of what learning to learn is and what constitutes it.

Practitioners, university lecturers, teachers and schools around the world are interested in their students becoming able to take responsibility for their own learning and achievement – and for this they need to learn how to learn. Existing funds of knowledge are all ‘out there on the internet’; what matters is how individuals and teams make sense of, and use, the mass of information that bombards them every day. Dialogue between research and practice is crucial to underpin this movement, generating a discipline of research-informed practice which frames and informs both commercial and policy interests. In the absence of a ‘pensée unique’, the global community of scholarship in education provides an important voice which should make a healthy, collaborative contribution to the formation of policy and practice.

Assessment of competence in learning to learn is a critically important policy ideal – one which the European Union embraced and embarked upon through its Learning to Learn working group. After serious effort, we came to the conclusion that there were so many different approaches to learning to learn across the EU that it was impossible, in 2007, to arrive at a consensus on its measurement. Before we can effectively assess something we need to know exactly what it is we are measuring – as a matter of professional ethics. We also need to know which measurement models are most suitable, and what the purpose of the assessment is, before we develop our assessment technologies. This book was conceived by people who participated in that EU project and, we hope, in an important way it keeps the dialogue alive.

Complexity and Learning to Learn

Learning to learn is a complex process rather than either a simple or even a complicated one. Demetriou’s chapter explores an architecture of mind that incorporates four inter-related systems, all of which may be relevant to learning to learn. Each contributor proposes a complex mix of processes that coalesce into learning to learn – including affective, cognitive and dispositional factors. All agree that learning to learn is about the promotion of self-directed learning, the cultivation of intrinsic motivation for learning and the development of intentional agency on the part of the learner. All agree that contextual factors – such as pedagogy, assessment regimes, quality of relationships and socio-cultural factors – interact and influence the ability of an individual to learn how to learn and to become an agent in their own learning journey. Learning to learn is messy and complex.

The implications of this complexity are enormous. As Edgar Morin argues (and Jung before him), Western thought has been dominated by the principles of disjunction, reduction and abstraction. Engaging with learning to learn as a complex process requires a paradigm of distinction-conjunction, so that we can distinguish without disjoining and associate without identifying or reducing. In short, we need to develop new and more holistic ways of understanding, facilitating and enabling learning to learn in our education communities – ways that hold in tension the inner, personal aspects of agency, purpose, desire and dispositions with the more measurable, external and public manifestations of learning, performance and collaboration with others. We need measurement models that can account for quality of trust as a core resource, and for story as a vehicle for agency, as well as the more traditional and familiar measures of performance and problem solving.

Becoming self-organising agents in our own lives

If learning to learn is about human beings becoming self-organising agents of their own lives, as our contributors suggest, then it is clear that ‘top down’, transmission oriented approaches to learning, teaching and school improvement are no longer enough. The challenge is how to create the conditions in which individual students are able to take responsibility for their own learning over time.  By definition, this cannot be done for them. It has to be by invitation, allowing learning to learn to emerge and fuel agency and purpose.

The establishment of the framework for international comparison of educational achievement provided by the OECD through the Programme for International Student Assessment (PISA), and the means for regularly compiling the data, is a considerable achievement. It has provided an evidence base with which governments can inform domestic educational policy and set priorities. What this dataset is less effective at revealing are the reasons behind international and regional differences: we still understand too little about what drives these broad numbers. Furthermore, the numbers continue to reveal deep, intractable challenges in education, such as embedded disadvantage linked to geography, economics and ethnicity.

There is a pressing need to assemble an internationally comparable set of data which can better inform our understanding of factors such as learning how to learn, and of how this varies within and between different contexts. The academic and theoretical work undertaken on these issues to date, while rich and deep, has focused on aspects of the problem, often failing to cross disciplinary boundaries. The real-world challenge of educational improvement, meanwhile, is relentlessly trans-disciplinary, involving a complex interplay between social, institutional and individual factors. It presents a challenge to both theory and practice. The PISA data, by comparison, achieves comparability through the use of widely available proxy indicators, but lacks the depth and resolution needed to provide an understanding of the mechanisms driving the patterns it surfaces.

Valuing Difference

What is also clear from this volume is the value of different cultures in the debate about learning to learn. Two chapters are written explicitly from an Eastern perspective – demonstrating how Confucian philosophy can enrich our understanding of learning to learn and challenging some deeply held Western assumptions. We have contributions from Australia, New Zealand, Finland, the UK, Spain, Austria, China, Italy and the USA and, uniquely, a set of case studies from learning to learn projects in remote Indigenous communities where the cultural differences are enormous. The image below shows a ‘brolga’, a community metaphor for creativity for children at Daly River School in the Northern Territory.

[Image: ‘Creativity’ – brolga artwork, Daly River School]

However comprehensive, this volume leaves a number of research and practice themes unaddressed, and raises questions for further research. Among these, perhaps the most pressing is the road towards the assessment of learning to learn – a daunting endeavour, although the volume lays a foundation by exploring what should be assessed in learning to learn, and why. Other open questions concern the deployment of learning to learn in school improvement; in the training of trainers, educators and educational leaders; and in personal development and empowerment. The connection of learning to learn with other key competences, such as active citizenship and entrepreneurship, also requires further study.

This book draws on a rich, global tradition of research and practice. It is written by researchers and  practitioners who care deeply about education and about learning how to learn in particular. Our purpose is to generate debate, to link learning communities and to make a contribution to the ways in which societies worldwide are seeking to re-imagine their education systems. Our hope is that learning to learn will soon find a consistent place in educational policies worldwide.

Learning Dispositions + Authentic Inquiry in a Primary School

What happens when you turn a curriculum topic over to 10–11 year old children, give them the freedom to choose their focus, and grant them increasing autonomy to make their own decisions as they design, create and run a showcase event? Indeed, how do staff cope with stepping back like this? If Ofsted inspectors were to walk in, how could the school evidence learning? How can you evidence the development of lifelong learning dispositions, and how does this relate to the school’s strategic concerns about the progress of different pupil groups on traditional attainment measures? What roles do social learning tools like reflective blogging have to play?

This movie provides a brief glimpse into a two-year series of pilots at Bushfield School, documented in more detail in this report. It represents the convergence of University of Bristol and Open University research and development into learning analytics that can evidence processes associated with deeper learning, especially dispositional analytics (learn more: replay talk / workshop).

(See Reports for the entire library of school case studies.)

Small, T., Shafi, A. and Huang, S. (2014) Learning Power and Authentic Inquiry in the English Primary Curriculum: A Case Study, Report No. 12, ViTaL Development & Research Programme, University of Bristol. [pdf]

This report documents progress in a two-year action-research programme at Bushfield School, Milton Keynes, with two main purposes: firstly, to build on the school’s success in developing children’s capacity to learn; secondly, to track and measure the impact of its interventions for this purpose. The school combined the Effective Lifelong Learning Inventory (ELLI) with the Authentic Inquiry learning methodology from the University of Bristol. Qualitative and quantitative data are combined to examine the impact of the pilots from the perspective of staff and pupils, comparing learning power against a range of demographic and attainment datasets, in the distinctive context of a primary school already experienced in the Building Learning Power approach.

Assessing learning dispositions/academic mindsets

A few years ago Ruth and I spent a couple of days with the remarkable Larry Rosenstock at High Tech High, and were blown away by the creativity and passion that he and his team bring to authentic learning. At that point they were just beginning to conceive the idea of a Graduate School of Education (er… run by a high school?!). Yes indeed.

Now they’re flying, running the Deeper Learning conference in a few weeks, and right now the Deeper Learning MOOC [DLMOOC] is doing a great job of bringing practitioners and researchers together – and that’s just from the perspective of someone on the edge who has only managed to replay the late-night (in the UK) Hangouts and post a couple of stories. Huge thanks and congratulations to Larry, Rob Riordan and everyone else at High Tech High Grad School of Education, plus of course the other supporting organisations and funders who are making this happen.

Here are two of my favourite sessions, in which we hear from students what it’s like to be in schools where mindsets and authentic learning are taken seriously, and a panel of researcher/practitioners:

Week 7 coming up is focused on Assessing for Deeper Learning, and I’m very much looking forward to hearing from the panellists. So in this post, I’d like to share some thoughts about how we go about assessing what some of us call lifelong learning dispositions, while others refer to academic mindsets, and welcome comments from DLMOOCers and others.

First, why do we want to assess mindsets?

In education, assessment serves many purposes which, when confused, can lead to dysfunctional assessment regimes. See the Assessment Reform Group resources for a helpful summary of this. The reasons for assessing, then, shape the forms that assessment should take.

“It is helpful to make a distinction, here, between the intended use, or uses, of assessment data, and their actual uses. Assessments are often designed quite differently to ensure their fitness for different purposes. Or, to put it another way, results that are fit to be used for one particular (intended) purpose may not be fit to be used for another, regardless of whether they are actually used for that additional purpose. We must therefore consider carefully the ways in which assessment data are actually used. Paul Newton has identified 22 such uses.” (Assessment in Schools: Fit for Purpose?, p.7)

[p.8] “For the purposes of further discussion, this commentary simplifies current uses of assessment by clustering them in three broad categories.

  1. The use of assessment to help build pupils’ understanding, within day-to-day lessons.
  2. The use of assessment to provide information on pupils’ achievements to those on the outside of the pupil–teacher relationship: to parents (on the basis of in-class judgments by teachers, and test and examination results), and to further and higher education institutions and employers (through test and examination results).
  3. The use of assessment data to hold individuals and institutions to account, including through the publication of results which encourage outsiders to make a judgment on the quality of those being held to account.”

Improve mindsets to boost conventional outcomes

It might be that the goal is primarily to change teacher practice, in order to then see measurable impacts of a conventional sort (stronger engagement through increased attendance; higher grades; increased college admissions). In this framing, mindsets themselves are not the object to be assessed or quantified; they are a means to better outcomes that we all agree are worthy (Types 2 and 3 above).

New ideas like mindsets may need to justify themselves in terms of the current order of merit, but we should note also that they may be in tension. If learners are growing in their curiosity and creativity, they may also be less and less inclined to perform standardized tasks which have no meaning, or sit highly artificial exams. So this perhaps moves us to a deeper rationale and more transformative vision for mindset assessment.

Make mindsets a first class outcome that we want to measure

The argument for mindsets seems to me to go beyond just improving conventional outcomes: it seeks to elevate them to ‘first class’ status as a transferable, improvable set of qualities that learners will need for lifelong learning in a complex world. In education, if you can’t measure it, no-one in policy will take any notice of it as something to care about, so we need robust, evidence-based forms of assessment, together with intervention strategies. This might expand mindset assessment into Type 1 above, because we want mindsets to be an explicit object of inquiry and conversation.

If we can develop rigorous, evidence-based ways to assess them, then we could help shift the mainstream educational paradigm, and associated assessment regimes dominating schools (and post-secondary/tertiary) that are constrained significantly by the need to assess only what can be measured at scale, quantitatively. That’s the 1-line version, expanded a bit in the intro movie to a Dispositional Analytics workshop we ran at Stanford last summer.

3 approaches to mindset assessment

Let me propose three approaches to mindset assessment, which have different qualities and tradeoffs:

[Figure: Venn diagram of the three approaches to assessing learning dispositions/mindsets]

  1. Heuristics
  2. Self-Diagnostics (formal and informal)
  3. Behavioural Analytics

Heuristics

Heuristics are what we use as reflective practitioners who are learning to recognize mindsets as they manifest in learner behaviour. They’re situated and grounded in real-life practice, and based on our developing understanding of what mindsets look like when they are displayed by a learner whom we know and interact with, in a context we know. We’ve learnt from talking with colleagues, reading articles, engaging in courses like DLMOOC, or perhaps going on more formal training in a specific approach. We may also have developed interventions to help coach mindsets. The knowledge is a mix of personal and social, tacit and explicit, and the quality of diagnosis and intervention necessarily varies from one individual to another. It might be transformative, but it may also be patchy in quality, and critically, it’s not possible to evidence growth easily in objective, quantifiable terms – e.g. to sceptical colleagues, parents who question it, policymakers wondering if they should pay attention to this, or (gasp) even the learners themselves who want to know how they’re doing!

Self-Diagnostics

Self-Diagnostics are based on the premise that the learner is the world authority on how they experience their own learning, and so self-report is a legitimate window into the learner’s mind. Given the links between mindset and profoundly personal, emotional elements such as sense of identity and purpose, asking learners how it feels for them seems a legitimate route to take.

Self-Diagnostics divide into what might be termed informal and formal.

Informal Diagnostics

…provide a representation of some sort (physical or digital) inviting the learner to express how they experience learning. A good representation externalises an interpretation of the world. The very process of designing or contributing to it can shape cognition. The affordances of a good collective representation direct the attention of the audience to emerging patterns that can then provoke reflection and conversation.

Here’s a wallchart where pupils at my local primary, Bushfield School, mark on a scale of 1–4 how they feel they’re doing with respect to Managing Distractions, one of about 30 dispositional ‘learning muscles’ in the Building Learning Power framework. [learn more about Bushfield’s use of Learning Power]

[Photos: the Bushfield School ‘Managing Distractions’ wallchart]

The screenshot below from the EnquiryBlogger tool shows the use of learner narrative in blog posts to reflect explicitly on learning dispositions, expressed through the animals representing the seven dimensions of learning power in ELLI (see below). [watch a talk about this]

[Screenshot: an EnquiryBlogger reflective blog post]

Formal Diagnostics

Formal Diagnostics are provided by tools which elicit learner self-report through Likert-scale items. Example clips from such tools include the Stanford PERTS Lab survey:

[Screenshot: Stanford PERTS Lab survey items]

Mindset Works:

[Screenshot: Mindset Works survey items]

…and the Effective Lifelong Learning Inventory (ELLI):

[Screenshot: ELLI questionnaire items]

The question banks in such tools should have been validated using well-established statistical and qualitative approaches to designing educational survey instruments [example]. Both the PERTS Lab and the ELLI programme recognise the potential of such tools to build a large dataset [ELLI infrastructure]. Depending on the approach, the quantitative results from the respondent are then rendered and fed back to mentors, who have a repertoire of interventions to suggest in a coaching conversation.
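To make this concrete, here is a minimal sketch in Python of the kind of scoring and reliability checking such validation involves. The dimension names, item groupings and 1–5 scale are hypothetical illustrations, not the actual ELLI or PERTS question banks:

```python
# Minimal sketch: scoring Likert-scale self-report items into dimension
# scores, and checking a scale's internal consistency with Cronbach's
# alpha. Items, groupings and the 1-5 scale are invented for illustration.
import statistics

# Hypothetical question bank: each dimension has several Likert items.
ITEM_GROUPS = {
    "curiosity":  ["q1", "q4", "q7"],
    "resilience": ["q2", "q5", "q8"],
}

def dimension_scores(responses):
    """Average one respondent's 1-5 ratings within each dimension."""
    return {dim: sum(responses[q] for q in items) / len(items)
            for dim, items in ITEM_GROUPS.items()}

def cronbach_alpha(item_columns):
    """Reliability of one dimension's items.

    item_columns: one list of ratings per item, respondents in the same order.
    """
    k = len(item_columns)
    sum_item_var = sum(statistics.pvariance(col) for col in item_columns)
    totals = [sum(ratings) for ratings in zip(*item_columns)]
    return (k / (k - 1)) * (1 - sum_item_var / statistics.pvariance(totals))

# Three respondents answering the hypothetical 'curiosity' items:
q1, q4, q7 = [4, 5, 2], [4, 4, 2], [5, 5, 3]
print(round(cronbach_alpha([q1, q4, q7]), 2))  # ~0.98: items hang together
print(dimension_scores({"q1": 4, "q4": 4, "q7": 5,
                        "q2": 3, "q5": 2, "q8": 3}))
```

A real instrument would of course be validated on far larger samples, alongside the qualitative design work mentioned above.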

Powerfully, learners may also receive feedback of some sort, since the whole point of mindset approaches is to catalyse change in the learner’s own dispositions, and get them to take more responsibility for how they engage. The evidence from ELLI is that immediately receiving the 7-legged visual spidergram can have a transformative effect on the learner, making visible for the first time a shape and language for talking about what were—until that point—intangible qualities.

[Screenshot: an ELLI spidergram]
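For readers curious about the mechanics, a spidergram of this kind is straightforward to render. Here is a minimal matplotlib sketch; the seven dimension labels are my reading of the ELLI learning-power dimensions, and the scores are invented:

```python
# Minimal sketch: a seven-dimension 'spidergram' (radar chart) of
# learning-power scores. Labels reflect my reading of the ELLI
# dimensions; the scores are invented.
import math
import matplotlib.pyplot as plt

dimensions = ["Changing & Learning", "Critical Curiosity", "Meaning Making",
              "Creativity", "Learning Relationships", "Strategic Awareness",
              "Resilience"]
scores = [0.7, 0.5, 0.8, 0.4, 0.9, 0.6, 0.55]  # invented, scaled 0-1

# One angle per dimension; repeat the first point to close the polygon.
angles = [2 * math.pi * i / len(dimensions) for i in range(len(dimensions))]
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions, fontsize=8)
ax.set_ylim(0, 1)
plt.show()
```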

While Self-Diagnostics capture what learners say about how they behave in different learning contexts, what do they actually do in practice? This brings us to the third approach to dispositional assessment.

Behavioural Analytics

Behavioural Analytics are on the frontier of learning technology research, a subset of the emerging field of Learning Analytics. Since we cannot follow students around with a camcorder recording their behaviour in anything other than very small-scale, expensive research experiments, this raises two questions:

  1. What kinds of learner behaviours can we track automatically, at scale?
  2. How do we map the contents of those low level (digital) logs to the high level personal qualities we care about in terms of mindsets?

The answers to these questions are only beginning to emerge. To take the first question, technologically, tracking learner behaviours is easy when they’re logged into a digital learning environment through a device: every click, page viewed, comment posted and social interaction can be logged. In a gaming or virtual reality environment, the number and variety of behaviours that can be logged explodes even further: where you go, where you look, who you meet, etc. And in the physical world, our behaviours are increasingly leaving digital traces through location and socially aware mobile devices, digital transactions, facial recognition, and the choice that an increasing number of people make to instrument their lives with other digital measures, such as exercise monitoring bracelets, sleep monitors, and so forth (the Quantified Self movement). See Erik Duval’s slides for a glimpse into this world…
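To ground the first question, here is a sketch of the kind of low-level event record a digital learning environment might log. The field names are hypothetical, loosely echoing the actor–verb–object shape of specifications such as xAPI, rather than any particular platform’s schema:

```python
# Minimal sketch of a clickstream event record -- the kind of low-level
# trace a digital learning environment can log at scale. Field names are
# hypothetical, not any particular platform's schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LearningEvent:
    learner_id: str
    timestamp: datetime
    verb: str          # e.g. "viewed", "attempted", "commented", "shared"
    object_id: str     # a page, quiz item, blog post, peer, ...
    outcome: str = ""  # e.g. "correct"/"incorrect"; empty if not assessed

# Every click, page view, comment and social interaction becomes one record:
events = [
    LearningEvent("s42", datetime(2014, 2, 28, 15, 2), "attempted", "quiz7", "incorrect"),
    LearningEvent("s42", datetime(2014, 2, 28, 15, 5), "attempted", "quiz7", "correct"),
]
```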

The second question is the tricky bit. How do we go from a behavioural trace of low-level system events to an inference that a learner is getting more resilient or curious, or starting to believe that they can get better at learning?

As adaptive learning platforms become available which closely track a learner’s trajectory through a course, and their mastery of the concepts in a curriculum, the possibilities for inferring mindsets grow. Appetite for a challenge can now be quantified at high fidelity, since the platform knows what lies at the limits of a learner’s understanding. Grit or resilience can be quantified in terms of persistence.
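As one illustration of how such an inference might work, here is a sketch that quantifies persistence as the share of failed attempts the learner goes on to retry rather than abandon. The rule is mine, purely for illustration, not a validated measure:

```python
# Sketch: inferring a crude 'persistence' indicator from an event trace.
# Illustrative rule of thumb, not a validated measure: of all failed
# attempts, what fraction does the learner retry rather than abandon?
def persistence(events):
    """events: time-ordered dicts with 'verb', 'object_id', 'outcome'."""
    failures = retried = 0
    for i, ev in enumerate(events):
        if ev["verb"] == "attempted" and ev["outcome"] == "incorrect":
            failures += 1
            # Did any later event re-attempt the same object?
            if any(later["verb"] == "attempted" and
                   later["object_id"] == ev["object_id"]
                   for later in events[i + 1:]):
                retried += 1
    return retried / failures if failures else 1.0

trace = [
    {"verb": "attempted", "object_id": "quiz7", "outcome": "incorrect"},
    {"verb": "attempted", "object_id": "quiz7", "outcome": "correct"},
    {"verb": "attempted", "object_id": "quiz9", "outcome": "incorrect"},
    {"verb": "viewed",    "object_id": "page3", "outcome": ""},
]
print(persistence(trace))  # 0.5 -- one of two failures was retried
```

Even this toy rule shows how quickly judgment calls accumulate: how long a gap still counts as a retry, whether an easier substitute task counts, and so on.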

In this slide, I suggest some ways in which online behaviours in a given environment might be mapped to ELLI’s learning dispositions:

[Slide: candidate mappings from online behaviours to ELLI’s learning dispositions]

See 50:20 in the talk replay for my explanation of these examples.

To understand what the process of mapping from the user actions possible in a learning environment to dispositional qualities might look like, see this research seminar, in which Shaofu Huang shared preliminary findings from our efforts to map between the SocialLearn platform and ELLI dimensions (it’s a tricky process!):

[Slide: mapping SocialLearn platform actions to ELLI dimensions]

In an educational gaming context, Valerie Shute has been developing what she calls Stealth Assessment [pdf]. The graphic below shows some of the qualities that she seeks to assess from the traces that school pupils leave in her gaming platform, which, you will see, overlap with academic mindsets:

[Graphic: qualities assessed through Stealth Assessment in Shute’s gaming platform]

Summary

So, to wrap up: these three assessment approaches each have their own strengths and weaknesses, and future work needs to explore how each can inform or enrich the others, as well as where they may differ in what they can tell us. The Venn diagram envisages that in the future we will combine insights from the three lenses to develop an increasingly rich learner profile – one which does justice to the complex, deeply personal qualities of dispositions (so that they have integrity as proxies for the student’s state of mind), but which can also serve as a new form of evidence-based assessment serving the needs of the different stakeholders in the educational system (see the ARG quotes).
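Purely as a sketch of what such a combined profile might look like in data terms (all field and dimension names hypothetical), the three lenses could sit side by side, with disagreements between lenses surfaced as conversation-starters rather than verdicts:

```python
# Sketch: a learner profile combining the three lenses. All names are
# hypothetical; the point is the shape of the combination, not the content.
from dataclasses import dataclass, field

@dataclass
class LearnerProfile:
    learner_id: str
    # Lens 1 -- Heuristics: dated free-text observations from mentors.
    observations: list = field(default_factory=list)
    # Lens 2 -- Self-Diagnostics: latest self-report score per dimension.
    self_report: dict = field(default_factory=dict)
    # Lens 3 -- Behavioural Analytics: indicators inferred from trace data.
    behavioural: dict = field(default_factory=dict)

    def divergences(self, threshold=0.3):
        """Dimensions where self-report and behaviour disagree -- prompts
        for a coaching conversation, not verdicts."""
        shared = self.self_report.keys() & self.behavioural.keys()
        return [d for d in shared
                if abs(self.self_report[d] - self.behavioural[d]) > threshold]

profile = LearnerProfile(
    "s42",
    observations=["2014-02-28: stuck with a hard problem all lesson"],
    self_report={"resilience": 0.4},
    behavioural={"resilience": 0.8},
)
print(profile.divergences())  # ['resilience']: says low, behaves high
```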

I look forward to seeing how DLMOOCers are thinking about assessment, and whether the above framework is helpful, or lacking in important respects.

Dispositional, Complex Systems Analytics @ LASI

We’ve just got back from an intensely busy, creative and enjoyable week at LASI13, Stanford University, where Ruth Deakin Crick, Chris Goldspink, Rebecca Ferguson, Nelson Gonzalez and I (online!) contributed to the programme. Ruth and Rebecca were both blogging, so you can read their in-situ reflections, and I’m sure there’ll be more to follow as we unpack the many ideas.

Following a successful Dispositional Learning Analytics workshop, on which we’ve had very positive feedback, awareness should grow of Learner Dispositions (or Mindset, as Dweck calls it), Teacher Dispositions, the potential of a multi-level Complex Systems approach to educational system design and, from Nelson’s work at Declara, the emerging potential of applying machine learning techniques to such datasets.

Ruth and Rebecca also participated in a panel on analytics for “21st century skills”, in which they introduced the wider audience to the learning power framework, a systems approach, and a particular implementation of these ideas as visual analytics within EnquiryBlogger. (Just in: we have a fresh set of data from a very exciting deployment of these ideas in a primary school over the last month, which we look forward to reporting on!)

All the morning sessions from Stanford are, or will soon be, replayable. Simon coordinated a global network of LASI-Local institutes and the online participants, whose blogs are automatically gathered in the LASI-Aggregator and provide a rich account of a momentous week.

Immediate thoughts on next steps were pooled from Stanford and global online participants in a Google Doc, to give you a feel for where people felt they’d got to by the end of Friday. Ruth documented and reported back on her breakout group’s discussion.

Watch the Society for Learning Analytics Research for news of where next, and start thinking about how you might contribute to the LAK14 conference next spring…

LearningEmergence team at Stanford July 1-5

The LearningEmergence team are prepping for the upcoming Learning Analytics Summer Institute, Stanford University (July 1-5). We’ll be running one of the workshops, exploring with colleagues from Stanford and many other places the emerging topic of Dispositional Learning Analytics.

Speaker lineup, abstracts, discussion papers and resources

Tune into the LASI plenaries via the online webcasts/replays, and connect with the global network of LASI-Locals running in parallel!
