Prof Tony Bryk – Master Class and Public Lecture – 21st May 2014

Prof Tony Bryk (President of the Carnegie Foundation for the Advancement of Teaching) will be providing a one-day masterclass, followed by a public lecture, on the Design-Educational Engineering and Development (DEED) approach to school improvement.

Masterclass Details:

Getting ideas into action:  Integrating research and practice in networked improvement communities in education

Where:  Systems Centre, Knowledge Exchange Suite, Merchant Venturers Building, BS8 1UB

When: Wednesday 21st May 2014 09:30 – 16:00

Tony Bryk masterclass
Professor Anthony Bryk will be facilitating a masterclass for leading practitioners on the Design-Educational Engineering and Development (DEED) approach to school improvement. This improvement research allows systems to address shared complex problems with all stakeholders, rapidly prototyping improvements and harnessing the emerging collective intelligence to facilitate organisational learning. The masterclass will communicate the key ideas of DEED as an improvement science and will provide opportunities for participants to explore its application to particular complex issues drawn from education.

To book:  http://bristol.ac.uk/education/events/2014/1032.html

Public Lecture Details:

Getting ideas into action: designing networked improvement communities

Where:  Systems Centre, Knowledge Exchange Suite, Merchant Venturers Building, BS8 1UB

When: Wednesday 21st May 2014 17:00 – 19:00

Professor Anthony Bryk (President of the Carnegie Foundation for the Advancement of Teaching) will be presenting the Design-Educational Engineering and Development (DEED) approach to school improvement. This powerful organisational process allows the whole system to address shared complex problems with all stakeholders, rapidly prototyping improvements and harnessing collective intelligence to facilitate organisational learning and sustainable improvement.

Flyer: [Tony Bryk final]

To book: http://bristol.ac.uk/education/events/2014/1031.html


Quantifying deeper learning dispositions for the future data cloud

The Big Data in Education workshop at George Mason University, Washington DC, is asking:

How might we, from scratch, design digital platforms to model multiple data streams from multiple sources in a generalized ecosystem of learning to make predictions about learning based on changes to instruction? We envision MOORs as digital terrains traversed by learners across formal and informal education (e.g., schooling, museums, the internet), and across the lifespan.

Slides from my intro talk, which connected Ruth and Chris’s research at Bristol University and Incept Labs with emerging concepts of the future learner’s personal data cloud, in which I manage the release of my behavioural and somatic data to boost my learning analytics…

To learn more about this work, browse this site; see specifically our talk at LAK12 and the Dispositional Learning Analytics workshop replay.

Assessing learning dispositions/academic mindsets

A few years ago Ruth and I spent a couple of days with the remarkable Larry Rosenstock at High Tech High, and were blown away by the creativity and passion that he and his team bring to authentic learning. At that point they were just beginning to conceive the idea of a Graduate School of Education (er… run by a high school?!). Yes indeed.

Now they’re flying: running the Deeper Learning conference in a few weeks, and right now the Deeper Learning MOOC [DLMOOC] is doing a great job of bringing practitioners and researchers together, and that’s just from the perspective of someone on the edge who has only managed to replay the late-night (in the UK) Hangouts and post a couple of stories. Huge thanks and congratulations to Larry, Rob Riordan and everyone else at High Tech High Grad School of Education, plus of course the other supporting organisations and funders who are making this happen.

Here are two of my favourite sessions, in which we hear from students what it’s like to be in schools where mindsets and authentic learning are taken seriously, and a panel of researcher/practitioners:

Week 7 coming up is focused on Assessing for Deeper Learning, and I’m very much looking forward to hearing from the panellists. So in this post, I’d like to share some thoughts about how we go about assessing what some of us call lifelong learning dispositions, while others refer to academic mindsets, and welcome comments from DLMOOCers and others.

First, why do we want to assess mindsets?

In education, assessment serves many purposes which, when confused, can lead to dysfunctional assessment regimes. See the Assessment Reform Group resources for a helpful summary of this. So the reasons for assessing shape the forms that assessment should take.

“It is helpful to make a distinction, here, between the intended use, or uses, of assessment data, and their actual uses. Assessments are often designed quite differently to ensure their fitness for different purposes. Or, to put it another way, results that are fit to be used for one particular (intended) purpose may not be fit to be used for another, regardless of whether they are actually used for that additional purpose. We must therefore consider carefully the ways in which assessment data are actually used. Paul Newton has identified 22 such uses.” (Assessment in Schools: Fit for Purpose?, p.7)

[p.8] “For the purposes of further discussion, this commentary simplifies current uses of assessment by clustering them in three broad categories.

  1. The use of assessment to help build pupils’ understanding, within day-to-day lessons.
  2. The use of assessment to provide information on pupils’ achievements to those on the outside of the pupil-teacher relationship: to parents (on the basis of in-class judgments by teachers, and test and examination results), and to further and higher education institutions and employers (through test and examination results).
  3. The use of assessment data to hold individuals and institutions to account, including through the publication of results which encourage outsiders to make a judgment on the quality of those being held to account.”

Improve mindsets to boost conventional outcomes

It might be that the goal is primarily to change teacher practice, in order to then see measurable impacts of a conventional sort (stronger engagement through increased attendance; higher grades; increased college admissions). On this view, mindsets themselves are not the object to be assessed or quantified; they are a means to better outcomes that we all agree are worthy (Types 2 and 3 above).

New ideas like mindsets may need to justify themselves in terms of the current order of merit, but we should note also that they may be in tension. If learners are growing in their curiosity and creativity, they may also be less and less inclined to perform standardized tasks which have no meaning, or sit highly artificial exams. So this perhaps moves us to a deeper rationale and more transformative vision for mindset assessment.

Make mindsets a first class outcome that we want to measure

The argument for mindsets seems to me to go beyond just improving conventional outcomes: it seeks to elevate them to ‘first class’ status as a transferable, improvable set of qualities that learners will need for lifelong learning in a complex world. In education, if you can’t measure it, no-one in policy will take any notice of it as something to care about, so we need robust, evidence-based forms of assessment and intervention strategies. This might expand mindset assessment into Type 1 above, because we want mindsets to be an explicit object of inquiry and conversation.

If we can develop rigorous, evidence-based ways to assess them, then we could help shift the mainstream educational paradigm, and the associated assessment regimes dominating schools (and post-secondary/tertiary education), which are significantly constrained by the need to assess only what can be measured at scale, quantitatively. That’s the one-line version, expanded a bit in the intro movie to a Dispositional Analytics workshop we ran at Stanford last summer.

3 approaches to mindset assessment

Let me propose three approaches to mindset assessment, each with different qualities and tradeoffs:

[Diagram: the three approaches to assessing learning dispositions/mindsets]

  1. Heuristics
  2. Self-Diagnostics (formal and informal)
  3. Behavioural Analytics

Heuristics

Heuristics are what we use as reflective practitioners who are learning to recognize mindsets as they manifest in learner behaviour. They’re situated and grounded in real-life practice, and based on our developing understanding of what mindsets look like when displayed by a learner whom we know and interact with, in a context we know. We’ve learnt from talking with colleagues, reading articles, engaging in courses like DLMOOC, or perhaps from more formal training in a specific approach. We may also have developed interventions to help coach mindsets. The knowledge is a mix of personal and social, tacit and explicit, and the quality of diagnosis and intervention necessarily varies from one individual to another. It might be transformative, but it may also be patchy in quality, and critically, it’s not easy to evidence that growth in objective, quantifiable terms—e.g. to sceptical colleagues, parents who question it, policymakers wondering if they should pay attention to this, or (gasp) even the learners themselves, who want to know how they’re doing!

Self-Diagnostics

Self-Diagnostics are based on the premise that the learner is the world authority on how they experience their own learning, and so self-report is a legitimate window into the learner’s mind. Given the links between mindset and profoundly personal, emotional elements such as sense of identity and purpose, asking learners how it feels for them seems a legitimate route to take.

Self-Diagnostics divide into what might be termed informal and formal.

Informal Diagnostics

…provide a representation of some sort (physical or digital) inviting the learner to express how they experience learning. A good representation externalises an interpretation of the world, and the very process of designing or contributing to it can shape cognition. The affordances of a good collective representation direct the audience’s attention to emerging patterns, which can then provoke reflection and conversation.

Here’s a wallchart where pupils at my local primary Bushfield School mark on a scale of 1-4 how they feel they’re doing with respect to Managing Distractions, one of about 30 dispositional ‘learning muscles’ in the Building Learning Power framework. [learn more about Bushfield's use of Learning Power]

[Photos: pupils’ self-ratings on the Bushfield School ‘Managing Distractions’ wallchart]

The screenshot below from the EnquiryBlogger tool shows how learners use narrative in blog posts to reflect explicitly on their learning dispositions, expressed through the animals representing the 7 dimensions of learning power in ELLI (see below). [watch a talk about this]

[Screenshot: EnquiryBlogger blog post reflecting on learning dispositions]

Formal Diagnostics

Formal Diagnostics are provided by tools which elicit learner self-report through a Likert scale. Example clips from such tools include the Stanford PERTS Lab survey:

[Screenshot: Stanford PERTS Lab survey]

Mindset Works:

[Screenshot: Mindset Works survey]

…and the Effective Lifelong Learning Inventory (ELLI):

[Screenshot: Effective Lifelong Learning Inventory (ELLI) survey]

The question banks in such tools should have been validated using well-established statistical and qualitative approaches to designing educational survey instruments [example]. Both the PERTS Lab and the ELLI programme recognise the potential of such tools to build a large dataset [ELLI infrastructure]. Depending on the approach, the quantitative results from the respondent are then rendered and fed back to mentors, who have a repertoire of interventions to suggest in a coaching conversation.
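To make the mechanics concrete, here is a minimal sketch of how such a tool might turn a respondent’s Likert answers into per-dimension scores, once a validated question bank exists. Everything here (the dimension names, item groupings and 1–5 scale) is invented for illustration; it is not the actual PERTS or ELLI scoring:

```python
# Minimal sketch of Likert-scale scoring for a dispositional survey.
# The item-to-dimension mapping below is a made-up illustration, not
# the validated ELLI or PERTS question banks.
from statistics import mean

ITEM_BANK = {
    "curiosity":  ["q1", "q4", "q9"],
    "resilience": ["q2", "q7", "q11"],
    "creativity": ["q3", "q5", "q12"],
}

def score_dimensions(responses):
    """Average a respondent's 1-5 Likert answers within each dimension."""
    return {dimension: round(mean(responses[item] for item in items), 2)
            for dimension, items in ITEM_BANK.items()}

responses = {"q1": 4, "q4": 5, "q9": 3, "q2": 2, "q7": 3, "q11": 4,
             "q3": 5, "q5": 4, "q12": 4}
print(score_dimensions(responses))
# {'curiosity': 4.0, 'resilience': 3.0, 'creativity': 4.33}
```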

Powerfully, learners may also receive feedback of some sort, since the whole point of mindset approaches is to catalyse change in the learner’s own dispositions, and get them to take more responsibility for how they engage. The evidence from ELLI is that immediately receiving the 7-legged visual spidergram can have a transformative effect on the learner, making visible for the first time a shape and language for talking about what were—until that point—intangible qualities.

[Image: ELLI’s seven-dimension spidergram feedback]
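As a rough illustration of how such feedback might be rendered, here is a minimal matplotlib sketch of a seven-dimension spidergram. The axis labels follow ELLI’s seven dimensions, but the scores are invented and this is not ELLI’s actual feedback tool:

```python
# Minimal spidergram (radar chart) sketch with matplotlib.
# The scores below are invented for illustration.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Changing & Learning", "Critical Curiosity", "Meaning Making",
              "Creativity", "Learning Relationships", "Strategic Awareness",
              "Resilience"]
scores = [3.5, 4.0, 2.5, 4.5, 3.0, 2.0, 3.5]

# Spread the seven axes around the circle, then close the polygon.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]
scores += scores[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, scores, linewidth=2)
ax.fill(angles, scores, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions, fontsize=8)
ax.set_ylim(0, 5)
plt.show()
```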

While Self-Diagnostics reflect what learners say about how they behave in different learning contexts, what do they actually do in practice? This brings us to the third approach to dispositional assessment.

Behavioural Analytics

Behavioural Analytics are on the frontier of learning technology research, a subset of the emerging field of Learning Analytics. Since we cannot follow students around with a camcorder recording their behaviour in anything other than very small-scale, expensive research experiments, this raises two questions:

  1. What kinds of learner behaviours can we track automatically, at scale?
  2. How do we map the contents of those low-level (digital) logs to the high-level personal qualities we care about in terms of mindsets?

The answers to these questions are only beginning to emerge. To take the first question, technologically, tracking learner behaviours is easy when they’re logged into a digital learning environment through a device: every click, page viewed, comment posted and social interaction can be logged. In a gaming or virtual reality environment, the number and variety of behaviours that can be logged explodes even further: where you go, where you look, who you meet, etc. And in the physical world, our behaviours are increasingly leaving digital traces through location and socially aware mobile devices, digital transactions, facial recognition, and the choice that an increasing number of people make to instrument their lives with other digital measures, such as exercise monitoring bracelets, sleep monitors, and so forth (the Quantified Self movement). See Erik Duval’s slides for a glimpse into this world…
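To make the first question concrete, here is a minimal sketch of the kind of low-level event record a platform might capture. The field names and helper function are invented for illustration; they are not SocialLearn’s, or any real platform’s, logging schema:

```python
# Toy clickstream logger. Field names are illustrative assumptions,
# not any real platform's schema.
import json
from datetime import datetime, timezone

EVENT_LOG = []  # in a real platform: a database or event stream

def log_event(learner_id, action, resource_id, outcome=None):
    """Append one low-level learning event to the toy event store."""
    event = {
        "learner_id": learner_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,            # "page_view", "comment_post", "quiz_attempt"...
        "resource_id": resource_id,  # the page, thread or quiz item acted upon
        "outcome": outcome,          # e.g. "pass"/"fail" for attempt events
    }
    EVENT_LOG.append(event)
    return event

log_event("s42", "page_view", "fractions-intro")
log_event("s42", "quiz_attempt", "fractions-q3", outcome="fail")
log_event("s42", "quiz_attempt", "fractions-q3", outcome="pass")
print(json.dumps(EVENT_LOG, indent=2))
```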

The second question is the tricky bit. How do we go from a behavioural trace of low-level system events to an inference that a learner is getting more resilient or curious, or starting to believe that they can get better at learning?

As adaptive learning platforms become available that closely track a learner’s trajectory through a course, and their mastery of the concepts in a curriculum, the possibilities for inferring mindsets grow. Appetite for a challenge can now be quantified with high fidelity, since the platform knows what lies at the limits of a learner’s understanding. Grit or resilience can be quantified in terms of persistence.
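As a hedged sketch of the very simplest version of such an inference, here is a toy ‘persistence’ proxy computed over event records like those in the logging sketch above: the fraction of failed attempts that the learner follows up with another try at the same item. A real mapping would need validating against instruments like ELLI; this is illustration only:

```python
# Toy behavioural proxy for persistence/grit: of the attempts a learner
# failed, what fraction did they come back and retry? An illustrative
# mapping, not a validated dispositional measure.

def persistence_proxy(events):
    """events: a time-ordered list of dicts like those in EVENT_LOG above."""
    failures = retried = 0
    for i, event in enumerate(events):
        if event["action"] == "quiz_attempt" and event["outcome"] == "fail":
            failures += 1
            # Did the learner attempt the same resource again later?
            if any(later["action"] == "quiz_attempt"
                   and later["resource_id"] == event["resource_id"]
                   for later in events[i + 1:]):
                retried += 1
    return retried / failures if failures else 0.0

events = [
    {"action": "quiz_attempt", "resource_id": "q3", "outcome": "fail"},
    {"action": "quiz_attempt", "resource_id": "q3", "outcome": "pass"},
    {"action": "quiz_attempt", "resource_id": "q9", "outcome": "fail"},
]
print(persistence_proxy(events))  # 0.5: one of the two failures was retried
```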

In this slide, I suggest some ways in which online behaviours in a given environment might be mapped to ELLI’s learning dispositions:

[Slide: candidate mappings from online behaviours to ELLI’s learning dispositions]

See 50:20 of the replay for my explanation of these examples.

If you want to understand what the process of mapping from the user actions possible in a learning environment to dispositional qualities might look like, see this research seminar, in which Shaofu Huang shared preliminary findings from our efforts to map between the SocialLearn platform and the ELLI dimensions (it’s a tricky process!):

[Screenshot: Shaofu Huang’s seminar on mapping SocialLearn behaviours to ELLI dimensions]

In an educational gaming context, Valerie Shute has been developing what she calls Stealth Assessment [pdf]. The graphic below shows some of the qualities that she seeks to assess from the traces that school pupils leave in her gaming platform, which, as you will see, overlap with academic mindsets:

[Graphic: qualities assessed by Valerie Shute’s Stealth Assessment]

Summary

So to wrap up: these three assessment approaches each have their own strengths and weaknesses, and future work needs to explore how each can inform or enrich the others, as well as where they may differ in what they can tell us. The Venn diagram envisages that, in the future, we will combine insights from the three lenses to develop an increasingly rich learner profile: one that does justice to the complex, deeply personal qualities of dispositions (they have integrity as proxies for the student’s state of mind), but that can also serve as a new form of evidence-based assessment meeting the needs of the different stakeholders in the educational system (see the ARG quotes above).
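Purely as a sketch of what that triangulation might look like as a data structure (every name here is invented), the key design choice is to keep evidence from the three lenses side by side, so that mentors can see where observation, self-report and behaviour agree or diverge, rather than averaging them into a single opaque number:

```python
# Illustrative learner profile juxtaposing the three assessment lenses.
# All names and values are invented for the sketch.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class DispositionEvidence:
    heuristic_note: str = ""                   # mentor's situated observation
    self_report: Optional[float] = None        # e.g. Likert-derived score, 1-5
    behavioural_proxy: Optional[float] = None  # e.g. persistence proxy, 0-1

@dataclass
class LearnerProfile:
    learner_id: str
    dispositions: Dict[str, DispositionEvidence] = field(default_factory=dict)

profile = LearnerProfile("s42", {
    "resilience": DispositionEvidence(
        heuristic_note="Kept reworking the bridge model after it collapsed",
        self_report=3.0,
        behavioural_proxy=0.5,
    ),
})
print(profile)
```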

I look forward to seeing how DLMOOCers are thinking about assessment, and whether the above framework is helpful, or lacking in important respects.

Research Metro Map

At the Festival of Postgraduate Research event held on Friday 21st February 2014 in Bristol, we presented a metro map of our research interests.

Not surprisingly, there was a great deal of overlap in our research areas, which we plotted on the metro map. Although it gave visitors to our stall a great overview of our research interests, it was more important to us, as a research centre, to see the relationships between us. What was interesting was observing a dialogue emerge as we looked at the map on the day, not only about shared interests but about potential links.

[Image: research metro map]

We intend to continue the discussion to develop this map further in our seminar series over the next few months.

Copies of our individual research posters can be found in the blog post Bristol Festival of Postgraduate Research.


Complexity, modelling and infrastructure futures: embedding learning in future business models

Professor Brian Collins is leading a multi-institutional research project, the International Centre for Infrastructure Futures (ICIF). A team from the University of Bristol is leading on the ‘learning framework’ for the business models, which will enable us to learn from the projects and from each other. Here, Professor Collins introduces the project and the challenges we face.

The deficiencies of our traditional, fragmented, sectoral approach to infrastructure creation and management are becoming more apparent as poorly appreciated interdependencies make infrastructures vulnerable to service disruptions and cascade failures. These lead to capacity limitations, inefficiencies, poor reliability, low adaptability and missed opportunities, which are further exacerbated by poor data and information about interdependencies in and between sectors at the organisational, commercial and policy levels. The resulting disjointed approaches to governance and policy create perverse incentives and conflicting actions, which then stifle collaborative innovation.

ICIF is a new way of bringing together the stakeholders involved in renewing the UK’s infrastructure to exploit multidisciplinary, systemic thinking about infrastructure interdependencies when developing the novel business models needed to address future challenges. The Centre will develop a principled and generic learning process for creating innovative, performance-based business models that exploit infrastructure interdependencies.

For Professor Collins’ recent keynote at INCOSE, follow this link.