Network for Evidencing Transformative Learning



The Learning Emergence team is collaborating in setting up a Networked Improvement Community for enhancing Students’ Resilient Agency. Working initially with schools in Sydney, Melbourne and Adelaide, the team already has aspects of this Network for Evidencing Transformative Learning (NETL) in place, aligned to school improvement priorities.


The NETL brings together teachers, school leaders and researchers to design and develop a Networked Improvement Community with the shared purpose of improving pedagogy to enhance students’ resilience in their learning and life narratives.

[Slide: school as a learning community]

  • Leaders’ learning will focus on enabling teachers to engage productively in professional enquiry tuned to school improvement.
  • Teacher learning will focus on pedagogies that support students’ resilience in learning and in life narratives.
  • Student learning will be conducted through authentic enquiry.

Objectives of the NETL

To understand and evidence the processes which contribute to the development of resilient agency in learning and life narratives for young people in formal and informal learning contexts.

To systematically model and represent these processes in such a way that the data can be used for self-evaluation and improvement at all levels of the learning organisation – students, teachers, leaders, system leaders.

To understand and develop both the social and technical resources that are necessary for sustainability in networked improvement communities.

The distinctive contribution of the NETL is that it will provide a means for schools to holistically and systematically improve their capability in pedagogies that nurture student resilience and agency. The project draws on improvement science and is positioned within a framework of participatory decision making in which key stakeholders collaboratively determine and systematically evaluate key processes and outcomes, rather than being determined by external regulation. The engine for improvement is the disciplined learning journeys of all stakeholders at all levels. This will provide a foundation for scaling up and sustainability.

Why does it matter?

  • There is a growing body of evidence that standards are plateauing in many schools and that current approaches to change and improvement have taken us as far as they can
  • Current approaches are over-simplistic and predominantly managerial, encourage shallow learning, and are not fit for effective education in the 21st century; instead, schools should be seen as complex, adaptive systems that support deep learning across the whole community
  • Several long-standing, complex issues continue to hamper further pupil progress, e.g. the negative impact of key transitions; slow progress in embedding best practice about teaching and learning within and between schools; the long ‘tail’ of underachievement
  • International evidence that the best education systems (in terms of sustained improvement) focus on the quality of teacher recruitment, professional learning and the development of outstanding, learning-centred leadership to ensure that pupils become resilient agents of their own learning, supported by parents and carers
  • Head teachers are constantly challenged to focus on day-to-day operational issues rather than the essential strategic, values-related and learning-focused activities that drive improvement at scale, led by stakeholder purpose
  • Wellbeing and resilient agency are a challenge for all schools: promoting physical and mental health in schools creates a virtuous circle reinforcing children’s attainment and achievement, which in turn improves their wellbeing, enabling children to thrive and achieve their full potential.

Social Organisation of the NETL

Researchers will be ‘critical friends’ with key leaders in schools. We will build directly on the experience and expertise in Improvement Science of colleagues at the Carnegie Foundation for the Advancement of Teaching in San Francisco, and on the experience of the Learning Emergence Network. Since we are concerned with the whole system, we will co-design interventions from three viewpoints: leaders, teachers and students. A key principle is that all members of a learning community take increasing responsibility for collaboratively leading their own learning and change:

  • Leaders will pilot visual mapping tools to gain insight into the complex dynamics of their schools as whole systems, in order to inform strategic planning, personal development and organisational learning
  • Teachers will engage in authentic enquiry & professional learning aimed at developing learner-centred practices, in order to develop professionally as facilitators of learning
  • Students will engage in personalised learning through authentic enquiry which enables them to self-assess and develop their learning power, and to progressively take responsibility for their own learning journey — in and out of school

Schools will have the opportunity to use state-of-the-art learning technologies for supporting critical enquiry, coaching, social learning and knowledge mapping. Evidence will be gathered throughout the three-year project, providing feedback for schools and data for researchers.

Virtual Organisation of the NETL

The Hub of the Networked Improvement Community will be based at the Connected Intelligence Centre at the University of Technology Sydney. The rapid prototype interventions designed by key stakeholders will be supported and scaffolded through a framework of tools that facilitate feedback for self-directed improvement, whilst allowing each site to design contextually specific prototypes. Three key platforms form the technical ecosystem that supports the project’s social organisation:

The Surveys for Open Learning Analytics Platform

Set up through crowd-sourced funding by the Learning Emergence Network, SOLA will provide rapid feedback of data for users, including students, which can be used for improvement, whilst at the same time capturing raw quantitative data for researcher analysis of the impact of the user-led interventions. The Crick Learning for Resilient Agency Profile (CLARA) is a key tool provided through this platform. Other tools and resources developed by the network will be available via SOLA’s identity management system.
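To illustrate the kind of dual-purpose feedback loop SOLA is intended to support, here is a minimal sketch that scores a set of Likert-style survey responses into named dimensions, returning per-dimension means for rapid user feedback while retaining the raw responses for researcher analysis. The item groupings and dimension mappings are hypothetical, not the actual CLARA instrument.

```python
# Hypothetical sketch of a survey-scoring step: the item-to-dimension
# mapping is illustrative, not the actual CLARA instrument.

# Map each survey item to the dimension it contributes to.
ITEM_DIMENSIONS = {
    "q1": "Curiosity", "q2": "Curiosity",
    "q3": "Hope & Optimism", "q4": "Hope & Optimism",
    "q5": "Collaboration", "q6": "Collaboration",
}

def score_profile(responses):
    """Aggregate raw Likert responses (1-5) into per-dimension means.

    Returns (feedback, raw) so one submission can drive rapid user
    feedback and still be retained as raw data for researchers.
    """
    totals, counts = {}, {}
    for item, value in responses.items():
        dim = ITEM_DIMENSIONS[item]
        totals[dim] = totals.get(dim, 0) + value
        counts[dim] = counts.get(dim, 0) + 1
    feedback = {dim: totals[dim] / counts[dim] for dim in totals}
    return feedback, dict(responses)

feedback, raw = score_profile(
    {"q1": 4, "q2": 5, "q3": 3, "q4": 4, "q5": 2, "q6": 3})
```

The same scored submission thus feeds both the coaching conversation with the learner and the aggregated dataset behind the research.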


A Social Learning Platform

for sharing and annotating learning resources.

An Evidence Hub

for pooling and evaluating learning from the rapid prototyping across the project, linking researcher and practitioner evidence.

A Network of Networks

Each leading stakeholder in the NETL (a school or learning centre) will itself be a node within a wider network. With close support from the NETL Hub, each school will be invited to engage in three inter-related programmes.

  • A two year professional learning programme, aligned to AITSL standards, which embeds teacher learning in the work of the NETL
  • A structured retreat programme for system leaders
  • An Australian Research Council funded Linkage Research Project (under review)

These programmes are distinct and inter-related, designed to build capacity in the leadership of each stakeholder group to continue and improve after the project is completed. A network of networks will develop supported by the social and technical resources at the Network Hub. In this way we expect to build capacity and sustainability and demonstrate inclusivity.

We are actively fundraising for this programme through formal research bids, crowd-sourcing, philanthropic and corporate sponsorship.

Learning Power: new research identifies Mindful Agency as central to resilience

For learning in a complex world of risk, uncertainty and challenge, what matters is being able to identify, select, collect, collate, curate and collaboratively re-construct information to suit a particular purpose. This is why there has been sustained and growing interest in learning dispositions and the personal qualities people, teams and communities need to flourish. As Edgar Morin says:

We need a kind of thinking that reconnects that which is disjointed and compartmentalized, that respects diversity as it recognizes unity, and that tries to discern interdependencies. We need a radical thinking (which gets to the root of problems), a multidimensional thinking, and an organizational or systemic thinking.

After fifteen years of experience in the research and practical application of learning power using a survey tool called the Effective Lifelong Learning Inventory (ELLI), Professor Crick, one of the originators, led the research team in a thorough review and reanalysis of the data. Supported by the Learning Emergence Network of international researchers, the results are now published for the first time in the British Journal of Educational Studies:

Ruth Deakin Crick, Shaofu Huang, Adeela Ahmed Shafi & Chris Goldspink (2015): Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies. DOI: 10.1080/00071005.2015.1006574. Open Access Eprint:

Interestingly, the support for this re-analysis came from the systems engineers in the Engineering Faculty at the University of Bristol, as part of the International Centre for Infrastructure Futures, rather than ELLI’s original home in the Graduate School of Education, where Crick, Broadfoot and Claxton began in 2000. Perhaps Morin would have something to say about this — we think so!

The new self-assessment tool, called the Crick Learning for Resilient Agency Profile (CLARA), identifies Mindful Agency as a key learning power dimension — one which predicts the set of active dimensions: Creativity, Curiosity, Sense-Making and Hope & Optimism. Two distinct Relationship dimensions measure Belonging and Collaboration. Finally, an Orientation to Learning indicator measures a person’s degree of Openness to change — in contrast to either fragile dependency or rigid persistence.

[Figure: internal structure of learning power, simplified view]

The new measurement model represented by CLARA resulted from a detailed exploration of the patterns, relationships and interdependencies within the key constructs through structural equation modelling (diagrammatic summary above). It is a more robust, parsimonious measurement model, with strengthened research attributes and greater practical value. The research demonstrates how the constructs included in the model link to the wider body of research, and how it serves to integrate a number of ideas that have hitherto been treated as separate. For more details from a user perspective see Introducing CLARA.

The CLARA model suggests a view of learning that, after Siegel, is:

an embodied and relational process through which we regulate the flow of energy and information over time in order to achieve a particular purpose.

Learning dispositions reflect the ways in which we develop resilient agency in learning by regulating this flow of energy and information. They enable us to engage mindfully with challenge, risk and uncertainty and to adapt and change in a way which is positively aligned with our purpose.

Resilient Agency is our capacity to move iteratively between purpose and performance, utilising our learning power and generating and re-structuring knowledge to serve our purpose.

Learning, from this viewpoint, is a journey which moves between purpose and performance – to put it another way, without purpose we are not really going to learn in a context of complexity and information overload. To learn when the outcome is not known in advance (which is most real-world learning) requires that we are able to navigate learning as a journey, utilising our Mindful Agency and restructuring information to achieve the outcome we need.

The Learning Emergence Network has teamed up with eXplorance Blue, one of the world’s leading survey providers, based in Montreal, to create the SOLA platform (Surveys for Open Learning Analytics), which can host CLARA and other assessment tools and, importantly, provide rapid feedback to users for improvement purposes.

Visual feedback to the learner from CLARA

The rapid analytic feedback to users who complete the questionnaire is returned in the form of a spider diagram, which provides a framework for a coaching conversation that can move between learning identity and purpose and the formulation of strategies for change. The new assessment tool is a focus for research and development around the world. Crick and Buckingham Shum are now based in the pioneering Connected Intelligence Centre and the School of Education at the University of Technology Sydney, where CLARA forms part of a research programme into dispositional learning analytics — alongside other learning analytics approaches designed to make visible, to learners and educators, the dynamics of lifelong learning qualities.
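The geometry behind such a spider diagram is straightforward to compute: each dimension is placed at an equally spaced angle and the score becomes a radius. The sketch below (dimension names and maximum score are illustrative, and the charting step is omitted) produces the (x, y) vertices a plotting tool would join into the familiar polygon.

```python
import math

def spider_vertices(scores, max_score=5.0):
    """Return (x, y) polygon vertices for a spider/radar diagram.

    scores: ordered mapping of dimension -> value; each dimension is
    placed at an equally spaced angle, with radius scaled to [0, 1].
    """
    dims = list(scores)
    n = len(dims)
    vertices = []
    for i, dim in enumerate(dims):
        angle = 2 * math.pi * i / n          # equally spaced spokes
        r = scores[dim] / max_score          # normalised radius
        vertices.append((r * math.cos(angle), r * math.sin(angle)))
    return vertices

verts = spider_vertices({"Mindful Agency": 5, "Curiosity": 2.5,
                         "Belonging": 5, "Openness": 5})
```

A coaching conversation can then point at individual spokes: a short radius on one dimension becomes a concrete prompt for discussing strategies for change.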

CLARA, and the knowledge and know-how in the research paper, have been made available for research and development under the terms of a Creative Commons Attribution-NonCommercial-NoDerivatives License. This permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited and is not altered, transformed, or built upon in any way.

We welcome all contributions to the ongoing research and development of this work, which has applications in education, industry and community. We have translated CLARA into Chinese, Russian and Spanish. For more details and opportunities for collaborative research and development please contact

Carnegie Summit on Improvement in Education

I’ve just returned from this conference at which I gave a paper reporting on our proof of concept study in applying hierarchical process modelling to school improvement: Evaluating Wider Outcomes of Schooling, the ECHO project. The paper reporting the project has just been accepted for publication in Educational Management, Administration and Leadership.

Improvement Science – Systems Architecting

The theme of the conference was Improvement Science. In the world of Engineering and infrastructure, this would be called ‘systems architecting’. It was about a holistic, rigorous approach to improving organisations as complex systems, engaging all stakeholders in defining purpose, analysing the system, defining a measurement model, rapidly prototyping improvement strategies, whilst harnessing collective intelligence and ‘learning our way forwards’.

There were thought-provoking keynotes on Improvement Science, Lessons from Improvement in Healthcare, and Resilience. There were over 1,000 people present, mostly from education, and a mixture of researchers and leading practitioners. It was inspiring to see and feel a new community of enquiry grow as we shared our bright spots and learned from our failures. Learning Together was a key theme, and we were convinced that together we can achieve more than any one of us can alone.

Summit Highlights

Throughout the Summit, attendees reported key messages and actionable takeaways via Twitter. The event’s energy and key ideas have been captured through a Storify. The closing highlight video can be viewed on the participant portal, and additional photos from the event are on the Carnegie Facebook page.

Session Materials

If you are looking for materials from any of the Summit breakout sessions, they are available online at the participant portal. Log in using the credentials below to view and download PDFs of session presentations and handouts. In addition, videos from the keynotes have been uploaded for your viewing.

Password: Summit2015

Learning to Improve

The Summit saw the launch of Learning to Improve, a key resource if you want to engage more deeply in improvement science. Additional copies can be purchased through Harvard Education Press or any major book distributor. If you would like to use the book as a text for a class, please contact Carnegie directly.

Learning Analytics + NICs for Systemic Educational Improvement


Personal reflections on two workshops and a lecture with Tony Bryk (Carnegie Foundation for the Advancement of Teaching), hosted last week by Ruth Deakin Crick at the University of Bristol. What follows, after a brief introduction to the concept of NICs, are my thoughts on the intersection of NICs with learning analytics. I made a number of connection points between features of the DEED+NIC approach and learning analytics, which I’ll highlight in green.

[Photo: with Gulzhan Azimbayeva, Harvey Goldstein, Ruth Deakin Crick and Tony Bryk]

[Figure: a slide from Tony Bryk’s presentation]

Introduction: NICs

The ideas of the human-centred computing pioneer Douglas Engelbart run like DNA through my work; I find so much depth of insight in them [see his Afterword to my book]. Doug showed the world in the 1960s many of the features that we now take for granted in our personal computing: the mouse, windows, hyperlinks, videoconferencing, direct editing of text on screen.

However, his work on making computers more intuitive as personal tools for thought was just part of his bigger vision for improving what he called our Collective IQ — humanity’s capacity to tackle “the complex, urgent problems” we face by working more effectively together.

The concept of the Networked Improvement Community (NIC) came from this work:

[Figure: Engelbart’s A, B and C activity levels]

“A” represents how the organization or community goes about its core business or mission; “B” represents the process by which it improves its core business activity (through the efforts of individuals and improvement communities); an Improvement Alliance is a “C” activity; “C” is any activity that improves “B” activity. By definition, improvement communities operate at the B and C levels. Conversely, any time more than one person is involved in a B or C activity, it’s an improvement community. An important function of “C” is to network improvement communities within and across organizations, forming a C level improvement community, aka “C Community” or “Improvement Alliance” of representative stakeholders from a variety of B activities. Organizations can also join forces at the C level to create a more robust C function, forming a super Improvement Alliance.
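The definition above is essentially a recursive rule: a B activity improves an A activity, and a C activity improves B activities. A minimal sketch of that structure, with class and example names of my own invention rather than Engelbart’s:

```python
# Minimal model of Engelbart's A/B/C improvement levels.
# Class, field and example names are illustrative, not Engelbart's own.

class Activity:
    def __init__(self, name, improves=None):
        self.name = name
        self.improves = improves or []   # activities this one improves

    @property
    def level(self):
        """A = core work; B improves A activities; C improves B
        activities (an Improvement Alliance networks B activities)."""
        if not self.improves:
            return "A"
        return "B" if all(a.level == "A" for a in self.improves) else "C"

teaching = Activity("classroom teaching")                 # core business
enquiry = Activity("teacher enquiry group", [teaching])   # improves A
alliance = Activity("improvement alliance", [enquiry])    # improves B
```

The recursion captures the key point that any activity whose object is another improvement activity is itself operating at the C level.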

Many people have explored and trialled this concept, experimenting with a range of technologies and ways of working designed to make evidence-based advances on complex problems. Educational examples of particular relevance include the Carnegie Foundation’s DEED methodology and Alpha Labs, the University of Bristol’s dispositional analytics research programme, the Learning Emergence network and its Evidence Hub, and the many Collaboratories for distributed research communities.

Tony Bryk used this alternative figure to show how “B” improvement clusters seeking to improve frontline “A” activities can themselves network to create a level “C” NIC:

[Figure: Bryk’s diagram of B-level improvement clusters networked into a C-level NIC]

Bryk and Gomez have documented the rationale behind their educational improvement science strategy in detail [pdf]. Their concept of an educational NIC cannot be applied to any collective, but comes with some distinctive features which I summarized as follows in the Edu-NICs workshop I ran:

[Slide: distinctive features of an educational NIC]

Scavenging from healthcare improvement science

In his workshop and public lecture, Tony Bryk described how he has ‘scavenged’ as much as he can from the healthcare profession’s adoption of improvement science; apparently, healthcare about 25 years ago stood where education stands today. It has taken two and a half decades’ concerted effort by the US Institute for Healthcare Improvement (IHI) to establish a new professional discipline, working on translating innovation into scalable practice. Healthcare shares with education a very similar gulf between academic scientific research, with its reward systems, and the translation of insights into scalable practice on the frontline.

Bryk pointed us to the work of Atul Gawande, who concluded his TED talk (18:10):

“Making systems work” is the great task of our generation for health, education, climate change, poverty…

This vignette from a UK Health Foundation movie shows how concepts such as practice-based learning, collective intelligence and evidence-informed practice are now becoming embedded, although, of course, nobody is declaring “mission accomplished”. Swap the words maternity care/hospitals/doctors/patients with education/universities/educators/learners — and it still all makes sense:

Bryk’s call to action is that within education there is precious little of this systematic, systemic, intentional improvement methodology to be found. Education is still stuck where healthcare was, with a fixation on the scientific paradigm for truth, grounded in randomized controlled trials. Instead, a new methodological paradigm is needed whose core question is not simply What works? in an isolated context, but How do we replicate and scale what works? across contexts. This is not because we can hope for ‘one size fits all’ solutions — quite the opposite — but because we understand how important certain contextual factors are to the embedding of that innovation:

[Slide: contextual factors shaping whether an innovation embeds]

Analytics implication? By extension, the challenge of improving education applies to learning analytics (which are after all, new kinds of tools for supporting different kinds of pedagogy and assessment). Learning analytics faces the same challenge of bridging the gulf between academic research and frontline practice, and generalizing findings. As success and failure stories in the field emerge, there is exactly the same need to try and understand the contributing contextual variables. A distinguishing feature may be that learning analytics contains the seeds for its own success, in this regard, since computational and statistical approaches to identifying the most predictive variables from large datasets could be used to advance the field’s own Level C learning — not just the learning of the students being tracked at Levels A and B.

Implications for ICT

Moving towards thinking about opportunities for ICT to add value, I summarised a set of functional roles as follows:

[Slide: functional roles for ICT in a NIC]

It is no coincidence that the above defines a socio-technical infrastructure not only for professionals seeking to advance their field, but also for scaffolding students in authentic, collaborative inquiry. Given the challenges we face, at many societal scales, we need to train the next generation more effectively to design inquiries, make sense of complex, heterogeneous scientific and practitioner data, from multiple perspectives and epistemic traditions, via a diversity of human and computational tools, as well as learning the skills of collaborative knowledge negotiation and community facilitation in the role of ‘hub’ catalysts.

I then stepped through this cycle, as detailed in these slides:

The remainder of this note focuses on the role of analytics.

Implications for Learning Analytics

Understanding the interplay between different levels in complex systems

In a special issue devoted to complexity science, social science and computation, colleagues documented the frontline challenges that need to be tackled in modelling complex social systems, among them, multilevel dynamics: how different levels, and systems of systems, influence each other. For computational social scientists seeking to simulate a social system formally in order to understand its structure and dynamics, this is a basic research frontier. We are not so ambitious as to want to simulate the social richness of schools or courses, but the challenge of understanding how the macro and micro shape each other is at the heart of the difficulty of educational reform, and the challenge of creating what Bryk calls “practical theories and methods” which are robust enough to make the journey from academia to the front line, negotiating all the constraints of politics and practice on the way.

The learning analytics community recognizes the different levels of data and analytics that are now in play within educational systems, but has no good accounts yet of how these influence each other. George Siemens and Phil Long introduced this diagram to distinguish learning analytics that attend to fine-grained patterns in learner behavior from academic analytics that focus on the more static demographics and periodic course outcomes of interest to strategic decision makers in institutions:

[Figure: learning analytics vs. academic analytics (Siemens & Long)]

In my own attempt to summarise the levels, I used micro/meso/macro terminology, and hinted at how the levels may start to inform each other:

[Figure: micro-, meso- and macro-level analytics and their interactions]

  • Macro-level analytics seek to enable cross-institutional analytics, for instance, through ‘maturity’ surveys of current institutional practices or improving state-wide data access to standardized assessment data over students’ lifetimes. Macro-analytics will become increasingly real-time, incorporating more data from the finer-granularity meso/micro levels, and could conceivably benefit from benchmarking and data integration methodologies developed in non-educational sectors (although see below for concerns about the dangers of decontextualized data and the educational paradigms they implicitly perpetuate).
  • Meso-level analytics operate at institutional level. To the extent that educational institutions share common business processes with sectors already benefitting from Business Intelligence (BI) methods and technologies, they can be seen as a new BI market sector, which can usefully appropriate tools to integrate data silos in enterprise warehouses, optimize workflows, generate dashboards, mine unstructured data, better predict ‘customer churn’ and future markets, and so forth. It is the BI imperative to optimise business processes that partly motivates efforts to build institutional-level “academic analytics”, and we see communities of practice specifically for BI within educational organisations, which have their own cultures and legacy technologies.
  • Micro-level analytics support the tracking and interpretation of process-level data for individual learners (and by extension, groups). This data is of primary interest to learners themselves, and those responsible for their success, since it can provide the finest level of detail, ideally as rapidly as possible. This data is correspondingly the most personal, since (depending on platforms) it can disclose online activity click-by-click, physical activity such as geolocation, library loans, purchases, and interpersonal data such as social networks. Researchers are adapting techniques from fields including serious gaming, automated marking, educational data mining, computer-supported collaborative learning, recommender systems, intelligent tutoring systems/adaptive hypermedia, information visualization, computational linguistics and argumentation, and social network analysis.

As the figure shows, what we now see taking place is the integration of, and mutual enrichment between, these layers. Company mergers and partnerships show business intelligence products and enterprise analytics capacity from the corporate world being integrated with course delivery and social learning platforms that track micro-level user activity. The aggregation of thousands of learners’ interaction histories across cohorts, temporal periods, institutions, regions and countries creates meso + macro level analytics with an unprecedented level of fine-grained process data (Scenario: comparing similar courses across institutions for the quality of online discourse in final year politics students). In turn, the creation of such large datasets begins to make possible the identification and validation of patterns that may be robust across the idiosyncrasies of specific contexts. In other words, the breadth and depth at the macro + meso levels add power to micro-analytics (Scenario: better predictive models and feedback to learners, because statistically, one may have greater confidence in the predictive power of key learner behaviours when they have been validated against a nationally aggregated dataset, than from an isolated institution).
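The statistical intuition behind that second scenario can be made concrete: the standard error of an estimated proportion shrinks with the square root of the sample size, so a learner behaviour validated against a pooled national dataset carries a much tighter confidence interval than the same estimate from one institution. A minimal sketch with entirely made-up numbers:

```python
import math

def proportion_se(successes, n):
    """Standard error of an estimated proportion p = successes / n."""
    p = successes / n
    return math.sqrt(p * (1 - p) / n)

# Made-up numbers: the same observed pass rate at a single
# institution versus pooled nationally across many institutions.
local_se = proportion_se(120, 200)       # one cohort
pooled_se = proportion_se(12000, 20000)  # national aggregate
```

With the same 60% rate, the pooled estimate's standard error is ten times smaller, which is exactly why macro-level aggregation adds power to micro-level predictive models.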

Example: Bryk reported that their Statway developmental mathematics initiative can triple the success rate of current programmes, in half the time. However, the next step is not merely to promote its success, publish, hope others pick it up, and move on to the next thing. Bryk emphasised the need to look at the variation and ask: why did one school fail dismally? What can we learn? It turned out that success was dependent on the presence of certain kinds of staff. In Improvement Science, “failure is a treasure”. That is counter-cultural to most kinds of research, where one always hopes for success, and it requires a bigger frame of reference which values the understanding of contextual variables and expects failure.

What I think we see with Bryk’s work on the DEED methodology is a mechanism by which we can build knowledge about how the micro/meso/macro layers of an educational system interact — the arrows in the figure. Since local context matters, micro-level results should be passed ‘up’ the levels in order to pool data, detect patterns, and interpret why things are breaking/working, in order to then make more effective interventions back ‘down’ in local contexts. The data explosion coming from the new kinds of micro-level learning analytics must be escalated and interrogated for higher order systemic learning, so that successful analytics interventions can be adapted and replicated for other contexts.

Seeing the system

Central to Bryk & Gomez’s conception of a NIC are shared representations, which help orient the collective to the nature and scope of the problem, candidate solutions, and criteria for success. Essentially, we’re talking about maps that help people know which piece of the jigsaw they are working on. As a collective builds common ground in language and terminology, they may be able to map the system in a way that serves as a common reference point (a boundary object in Leigh Star’s terms). One example would be:

[Figure: example of a shared system map]

In a collective intelligence NIC platform designed to support the emergence of a community aligning themselves to such a map, we would then expect that these maps can serve as navigational aids around the knowledge space:

[Screenshot: maps serving as navigational aids around the knowledge space]

This is scaffolded, for instance, in the Evidence Hub platform, e.g. click on this image to see how the Hub’s building blocks (Issues, Claims, Supporting and Challenging Evidence, People, Organizations, Projects) interconnect around a given central theme:

[Screenshot: Evidence Hub building blocks interconnecting around a central theme]

or are being tackled by location of Project and Organization:

[Screenshot: Projects and Organizations mapped by location]

As the NIC builds its knowledge, one wants to know the state of the debate, and open issues, e.g. What evidence-based claims can we make? In what context does this approach work? Who is working on this problem in a Muslim context? etc. A NIC platform should serve as an analytics hub, generating views from the aggregated data flowing to it from the many local experiments. Two examples are a Knowledge Tree and an Argument Map:

[Screenshot: Knowledge Tree]

[Screenshot: Argument Map]
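The Evidence Hub building blocks named earlier (Issues, Claims, Supporting and Challenging Evidence, People, Organizations, Projects) lend themselves to a small typed graph, which is what makes views like argument maps computable from the aggregated data. A sketch in which the node types come from the post but the API itself is invented:

```python
# Hypothetical sketch of an evidence-hub graph; node types follow the
# Evidence Hub building blocks, but this API is invented for illustration.

class Hub:
    def __init__(self):
        self.nodes = {}    # id -> node type
        self.edges = []    # (source_id, relation, target_id)

    def add(self, node_id, node_type):
        self.nodes[node_id] = node_type

    def link(self, source, relation, target):
        self.edges.append((source, relation, target))

    def evidence_for(self, claim):
        """All evidence edges that support or challenge a claim."""
        return [(s, rel) for s, rel, t in self.edges
                if t == claim and rel in ("supports", "challenges")]

hub = Hub()
hub.add("issue-1", "Issue")
hub.add("claim-1", "Claim")
hub.add("ev-1", "Evidence")
hub.add("ev-2", "Evidence")
hub.link("claim-1", "addresses", "issue-1")
hub.link("ev-1", "supports", "claim-1")
hub.link("ev-2", "challenges", "claim-1")
```

Queries like “what evidence-based claims can we make?” then reduce to traversals over this typed structure.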

The DEED methodology introduces educational leaders to some of the most common problem structuring representations in business analysis, such as Fishbone (Ishikawa) Diagrams and Driver Diagrams.

In this method, the Fishbone is used to map how the team is defining the system to be improved, e.g.

Screen Shot 2014-05-27 at 18.21.56

Systems thinkers and engineers do of course bring a well-tested armoury of representational schemes and support tools to the task of evolving a picture of the system in a participatory way. Bryk has found the ones shown here to be simple and effective when working with educators, but I doubt he would exclude the relevance of other schemes.

Mapping the drivers

Focal areas of such a system picture are then selected for intervention, based on the best available knowledge of what drives a desired Aim:

Screen Shot 2014-05-27 at 18.22.44

For instance, there is an Alpha Lab NIC targeting Productive Persistence in student mathematics, using the following driver diagram:


This is itself a distillation of a significant, complex research literature (identifying many variables from many survey tools) into a “Practical Theory” that practitioners can work with. Expanded slightly, it looks like this, showing candidate interventions to be tried, and the sources of evidence underpinning them:

Screen Shot 2014-05-27 at 18.28.37

Zooming in on the right hand Change Ideas column, we see candidate interventions:

Screen Shot 2014-05-27 at 18.35.55
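The Aim / primary driver / Change Idea structure above can be sketched as a small tree. In this Python sketch the class names are illustrative, and the Productive Persistence labels are paraphrased examples rather than Carnegie's exact wording:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of a driver diagram as a tree: an Aim, the primary
# drivers believed to influence it, and candidate Change Ideas per driver.

@dataclass
class ChangeIdea:
    description: str
    evidence_source: str  # where the justification comes from

@dataclass
class Driver:
    name: str
    change_ideas: List[ChangeIdea] = field(default_factory=list)

@dataclass
class DriverDiagram:
    aim: str
    drivers: List[Driver] = field(default_factory=list)

    def all_change_ideas(self):
        return [idea for d in self.drivers for idea in d.change_ideas]

diagram = DriverDiagram(
    aim="Productive persistence in student mathematics",
    drivers=[
        Driver("Students believe they are capable of learning maths",
               [ChangeIdea("Growth-mindset messaging in week 1",
                           "research literature")]),
        Driver("Students feel socially tied to peers and faculty",
               [ChangeIdea("Structured group work routines",
                           "practitioner knowledge")]),
    ],
)
print(len(diagram.all_change_ideas()))  # 2
```

Recording an evidence source against every Change Idea is what keeps the diagram a "Practical Theory" rather than a wish list.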

Within the Open University, we are developing a similar approach to justifying why we think an intervention will pay off, and how it will be tracked. In the figure below, each row in the matrix represents a student experience intervention, and the columns specify a range of metadata: the data sources required, the time window for expected impact, who is responsible, and the behavioural measures to be tracked in order to evidence impact (or its absence). One would want the Rationale and Outcome cells in the matrix to have backing stronger than a hunch; they could link out to a living document of some sort in which we build our collective understanding of what works, what doesn't, and why we believe this.

Screen Shot 2014-05-29 at 18.42.54
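One row of such a matrix could be captured as a simple record. In this sketch the field names are my own paraphrase of the columns described above, not the OU matrix's actual headings, and the example intervention is hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch of one row of the intervention matrix.
@dataclass
class Intervention:
    name: str
    rationale: str          # should link to evidence, not rest on a hunch
    data_sources: list      # data sources required
    impact_window_weeks: int
    owner: str              # who is responsible
    tracked_measures: list  # behavioural measures evidencing impact

row = Intervention(
    name="Week-3 tutor phone call to inactive students",
    rationale="Early contact correlates with retention (internal analysis)",
    data_sources=["VLE activity logs"],
    impact_window_weeks=4,
    owner="Student Support Team",
    tracked_measures=["VLE logins per week", "assignment submission"],
)
print(row.owner)
```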

The Hub could take many forms, from an internal spreadsheet/wiki to a Driver Diagram, possibly organized in a purpose-designed knowledge-building platform like the Evidence Hub or its descendant, the Impact Map.

The progress we are making here is to encourage the representation of the working theory about why certain interventions (Change Ideas) may have an impact on the desired learning behaviour. Once Change Ideas are coupled with one or more learning analytics, one has created a rapid feedback loop. This is essentially a methodology and design rationale for the selection and orchestration of analytics, based on the strongest practitioner and scientific evidence available to that team at the time: their local collective intelligence, incomplete or possibly even wrong to start with, but refined as it is passed to higher levels of sensemaking in the NIC, perhaps borrowing from and adapting other teams' theories: a broader, deeper form of collective intelligence.

Analytics-powered Driver Diagrams: Perimeta System Models

We have been having an extraordinarily fruitful collaboration with colleagues in the University of Bristol Systems Centre, who recognise the pivotal role that a disposition to learn has in the design of solutions to multi-stakeholder, wicked, socio-technical problems. They have developed a methodology, algorithm and support tool called Perimeta for modeling complex systems, in explicit recognition that uncertainty is inherent in decision-making. It is therefore vital (1) to see both supporting and challenging evidence of progress, and (2) to know what one doesn't know.

As detailed in a Learning Emergence technical report piloting the approach in a schools context [pdf], of the many systems thinking approaches available, one of the most appropriate for supporting collaborative development and leadership decision-making in complex systems such as learning communities is hierarchical process modeling (HPM), which has three important characteristics:

  • Visual, effective reporting of complex ideas and information, using hierarchical mapping of processes and an ‘Italian Flag’ model of evidence;
  • Assimilation of all forms of evidence – data, prediction and opinion; and
  • Facilitated access to the key information required for informed discussion, innovation and agreement.

Perimeta supports collaborative development of solutions to complex problems by providing a highly visual interface for understanding complex cause-and-effect and complex evidence. Perimeta can be described as:

  • a learning analytic designed to model diverse and complex processes
  • driven by stakeholder purpose
  • capable of dealing with hard, soft and narrative data in evidence of success, failure and ‘what we don’t know’
  • a visual environment for sense-making
  • a framework for self-evaluation and dialogue

The key point to make is that this hierarchical process model is essentially a Driver Diagram in terms of Bryk’s work, a working theory of what factors contribute to desired outcomes:

Screen Shot 2014-05-29 at 18.46.47

The difference is that this Driver Diagram is ‘executable’, since HPM provides a way to aggregate different kinds of evidence being gathered at the ‘leaves’ of the branches, resulting in a kind of analytics ‘dashboard’:

Screen Shot 2014-05-28 at 11.28.06

Recognising the uncertainty inherent in most data, the Perimeta model adopts an ‘Italian Flag’ visual to represent the quality of all the evidence, consisting of:

  • ‘Green’ representing the strength of positive evidence
  • ‘Red’ representing the strength of negative evidence
  • ‘White’ representing lack of evidence, or uncertainty (the ‘white space’ awaiting exploration)
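The flag, and its roll-up through the hierarchy of processes, can be sketched in a few lines. This is a simplification for illustration only: the flag is modelled as a (green, red) pair with white as the remainder, and child flags are rolled up by weighted averaging, whereas Perimeta's actual evidence-propagation algorithm is more sophisticated:

```python
from dataclasses import dataclass

# Minimal sketch of an 'Italian Flag' evidence interval and its
# aggregation up a hierarchical process model.
@dataclass
class ItalianFlag:
    green: float  # strength of positive evidence
    red: float    # strength of negative evidence

    @property
    def white(self):  # lack of evidence / uncertainty
        return 1.0 - self.green - self.red

def aggregate(children, weights=None):
    """Roll child process flags up to their parent process (weighted average)."""
    if weights is None:
        weights = [1.0 / len(children)] * len(children)
    green = sum(w * c.green for w, c in zip(weights, children))
    red = sum(w * c.red for w, c in zip(weights, children))
    return ItalianFlag(green, red)

# Three leaf processes: one healthy, one contested, one with no evidence yet.
leaves = [ItalianFlag(0.6, 0.1), ItalianFlag(0.3, 0.3), ItalianFlag(0.0, 0.0)]
parent = aggregate(leaves)
print(round(parent.green, 2), round(parent.red, 2), round(parent.white, 2))
# 0.3 0.13 0.57
```

Note how the evidence-free leaf drags the parent's white space up, making "what we don't know" visible at the dashboard level rather than hiding it.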

The evidence can be sourced from many places, but must be mapped into a weighting table. For instance, to map from responses to a Likert scale survey tool, HPM uses the following:

Screen Shot 2014-05-28 at 11.30.59
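To make the idea of a weighting table concrete, here is a sketch of how Likert responses might be mapped to the flag's green/red/white proportions. The weights below are invented for illustration; the actual HPM table is the one in the screenshot above and may differ:

```python
# Hypothetical mapping from 5-point Likert responses to an Italian Flag.
LIKERT_WEIGHTS = {
    # response: (contribution to green, contribution to red)
    "strongly agree":    (1.00, 0.00),
    "agree":             (0.75, 0.00),
    "neutral":           (0.00, 0.00),   # counts towards white / uncertainty
    "disagree":          (0.00, 0.75),
    "strongly disagree": (0.00, 1.00),
}

def likert_to_flag(responses):
    """Average the per-response weights into (green, red, white)."""
    n = len(responses)
    green = sum(LIKERT_WEIGHTS[r][0] for r in responses) / n
    red = sum(LIKERT_WEIGHTS[r][1] for r in responses) / n
    return green, red, 1.0 - green - red

g, r, w = likert_to_flag(["agree", "agree", "neutral", "disagree"])
print(g, r, w)
```

Whatever the exact weights, the key design point stands: neutral or missing responses surface as white space rather than being silently discarded.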

To conclude an extremely fruitful collision of research programmes, two points:

We can envisage combining Driver Diagrams sourced from the literature (cf. the Productive Persistence figure) with DDs sourced from practitioners' knowledge of local conditions, in order to design analytics that feed a system-wide Perimeta model used to monitor the health of the system as a whole.

Hierarchical process models such as the above provide a way to create a more holistic set of analytics: a way to quantify the wider range of educational outcomes that institutions value, adding a systems-level view to the many kinds of micro-level analytics now being developed.

The agenda to develop a holistic conception of the learner and citizen, and analytics fit for that purpose, is now building momentum as a wider network of people connect with each other. I'd recommend the ongoing series of Reinvent the University for the Whole Person video roundtables as a great way to tune in…

Screen Shot 2014-06-06 at 10.33.08


Networked Improvement Communities: Bryk lectures Bristol 2014

‘Making Systems Work – whether in healthcare, education, climate change, or making a pathway out of poverty – is the great task of our generation as a whole’ and at the heart of making systems work is the problem of complexity. 

Prof Tony Bryk, President of the Carnegie Foundation for the Advancement of Teaching, spent a week with people from the Learning Emergence network, leading a Master Class for practitioners, delivering two public lectures and participating in a consultation on Learning Analytics Hubs in Networked Improvement Communities (background). A key idea is that in order to engage in quality improvement in any system, we need to be able to ‘see the system as a whole’ and not just step in and meddle with one part of it.

This is an approach which links ‘top down’ measures of performance with a ‘bottom up’ approach to organisational improvement, including all stakeholders in understanding and analysing the problem and developing shared ‘aims and purposes’. Having identified a ‘high leverage’ problem for improvement and a community-generated driver diagram, attention is focused on the processes which need improvement and will contribute to achieving the shared purpose. Commitment to a common measurement model, and multiple rapid prototype interventions proceeding as part of a shared improvement network, enable the networked community to ‘learn fast, fail fast and improve fast’.

These slides are from Professor Bryk’s public lecture; links on this page will allow you to access some more practical slides from the Master Class.

Six principles of networked improvement communities

1.  Make the work problem-specific and user-centered.
2.  Variation in performance is the core problem to address.
3.  See the system that produces the current outcomes.
4.  We cannot improve at scale what we cannot measure.
5.  Anchor practice improvement in disciplined inquiry.
6.  Accelerate improvements through networked communities.

Learning Analytics for NICs

The social learning infrastructure required for a successful Networked Improvement Community is both organisational and virtual. Learning analytics and virtual learning networks can rapidly speed up the sharing of learning and the feedback of data from prototypes, enhancing the speed and quality of improvement. A workshop on Educator-NICs: Envisaging the Future of ICT–enabled Networked Improvement Communities shared current knowledge and know-how, providing an exciting vision for the future of learning analytics (leading to these reflections on Bryk’s work and learning analytics).