Learning Analytics for 21st Century Competencies?

We’re delighted to announce a Special Section of the Journal of Learning Analytics, published this week, focusing on the challenge of Learning Analytics for 21st Century Competencies. In our editorial we introduce the nature of the challenge, and after summarising the different researcher and practitioner papers, propose a complex systems approach which takes seriously the ‘layers, loops and processes’ of learning infrastructures and the iterative relationship between the human and the digital, where people learn at the nodes of networked flows of information.

Learning analytics is an emerging field powered by the paradigm shifts of the information age. At the heart of the challenge is pedagogy that produces students capable of thriving in conditions of complexity, risk, and challenge by taking responsibility for their own learning journeys, with technology and analytics scaffolding this process. The field is still young, and still finding its way. These papers offer a unique ‘window’ onto this programme from the viewpoints of both users and researchers.

You can enjoy full access to all the articles, since JLA is an open access journal.

I gave an overview of the topic and some of the papers in the above volume in this talk to the Asian Learning Analytics Summer Institute, with thanks to Yong-Sang Cho and the LASI-Asia team for the kind invitation…

SPECIAL SECTION: LEARNING ANALYTICS FOR 21ST CENTURY COMPETENCIES

Learning Analytics for 21st Century Competencies

Simon Buckingham Shum, Ruth Deakin Crick

Towards the Discovery of Learner Metacognition From Reflective Writing

Andrew Gibson, Kirsty Kitto, Peter Bruza

An Approach to Using Log Data to Understand and Support 21st Century Learning Activity in K-12 Blended Learning Environments

Caitlin K. Martin, Denise Nacu, Nichole Pinkard

Understanding Learning and Learning Design in MOOCs: A Measurement-Based Interpretation

Sandra Kaye Milligan, Patrick Griffin

Practical Measurement and Productive Persistence: Strategies for Using Digital Learning System Data to Drive Improvement

Andrew Edward Krumm, Rachel Beattie, Sola Takahashi, Cynthia D’Angelo, Mingyu Feng, Britte Cheng

Analytics for Knowledge Creation: Towards Epistemic Agency and Design-Mode Thinking

Bodong Chen, Jianwei Zhang

Tracking and Visualising Student Effort: Evolution of a Practical Analytics Tool for Staff and Student Engagement

Robin Paul Nagy

Marks Should Not Be the Focus of Assessment – But How Can Change Be Achieved?

Darrall G Thompson

Scaffolding deep reflection with automated feedback?

We’ve all got used to the idea that computers can understand writing and speech to some degree — Google adverts that match your search queries… asking Siri simple questions… IBM Watson winning Jeopardy! But how does natural language processing fit into learning that goes beyond getting the right answer to a focused question, or matching some key concepts?

Language is clearly front and centre in the way that we learn from others, share our understanding, and narrate to ourselves. However, the idea that computers have any substantive contribution to make to the teaching and assessment of writing elicits strong reactions from educators, and understandably so (learn more from this workshop).

In recent work that we’ve been doing at University of Technology Sydney, we’re exploring to what extent text analytics could support the integrative practices of writing that many are now using to help students reflect on their learning. This is particularly germane to future-oriented pedagogy that seeks to help the learner narrate their own learning journey in relation to their identity (how does my learning change who I think I am becoming?), place them in authentic (often professional) learning contexts (on the ward; in the classroom; in a company), and so forth.

As we explain in this article, there are many pedagogical reasons for valuing deep, academic reflective writing, but it poses significant challenges: for educators, to teach it well, and for students, for whom this is often a new genre of writing. Below is the presentation we gave earlier this year. Currently the Academic Writing Analytics (AWA) web tool is available only within UTS while we pilot it, but in future it may become available to other universities and, beyond that, to schools interested in collaborative research.
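To make concrete what ‘reflective writing analytics’ might look for, here is a deliberately naive sketch in Python: a rule-based tagger that flags sentences containing first-person reflective expressions (feelings, realisations, intentions for change). The categories, patterns and sample text are illustrative assumptions only; AWA itself uses far more sophisticated rhetorical parsing than simple keyword matching.

```python
import re

# Toy patterns for first-person reflective moves. These three categories
# (feelings, realisations, intentions) are illustrative assumptions, not
# the actual AWA scheme.
PATTERNS = {
    "feeling": re.compile(r"\bI (felt|feel|was surprised)\b", re.I),
    "realisation": re.compile(r"\bI (realised|realized|learned|now understand)\b", re.I),
    "intention": re.compile(r"\b(next time|in future) I (will|plan to)\b", re.I),
}

def tag_reflective_sentences(text: str) -> list[tuple[str, list[str]]]:
    """Split text into sentences and label each with any reflective
    categories whose pattern it matches."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tagged = []
    for s in sentences:
        labels = [name for name, pat in PATTERNS.items() if pat.search(s)]
        tagged.append((s, labels))
    return tagged

# A made-up fragment in the style of a professional placement reflection.
sample = ("The ward round was chaotic. I felt out of my depth. "
          "I realised I had not prepared the handover notes. "
          "Next time I will review the charts beforehand.")

for sentence, labels in tag_reflective_sentences(sample):
    print(labels, "--", sentence)
```

Even this crude sketch hints at why the problem is hard: descriptive sentences (“The ward round was chaotic”) get no labels, yet they are often the context that makes the reflection meaningful, which is exactly where pattern matching gives way to deeper linguistic analysis.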

Buckingham Shum, S., Á. Sándor, R. Goldsmith, X. Wang, R. Bass and M. McWilliams (2016). Reflecting on Reflective Writing Analytics: Assessment Challenges and Iterative Evaluation of a Prototype Tool. 6th International Learning Analytics & Knowledge Conference (LAK16), Edinburgh, UK, April 25 – 29 2016, ACM, New York, NY. http://dx.doi.org/10.1145/2883851.2883955 Preprint: http://bit.ly/LAK16paper

Gonski? Let’s get serious about school improvement

Good news: the Prime Minister is reconsidering his government’s decision not to fund the remaining educational reforms recommended in 2011 by the Gonski Report. However, the depressing track record of so many school improvement efforts was highlighted last week when new education minister Simon Birmingham noted:

“We need to acknowledge that state and federal governments have ploughed lots more money into schools in recent years and with all of that extra money we haven’t necessarily seen improved educational outcomes” […] “There’s far more to getting better outcomes than just putting more money on the table.”

And he’s right. New work from Tony Bryk, President of the Carnegie Foundation for the Advancement of Teaching, who is in Australia this week, shows just why educational improvement efforts so often fire blanks — but also how his team is building the capacity of education leaders and practitioners to address complex educational problems with rigorous evidence.

Improvement Science uses disciplined enquiry and analysis to inform ‘on-the-ground’ change efforts, adopting rigorous protocols for testing improvement ideas in practice. In this way, leaders’ and practitioners’ ‘learning by doing’ accumulates through rapid prototyping into practical field knowledge capable of producing quality outcomes.
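For readers unfamiliar with the mechanics, the testing protocol at the core of Improvement Science is the Plan-Do-Study-Act (PDSA) cycle. The sketch below, with entirely hypothetical field names and data, shows how successive cycles might be recorded so that a practical measure can be tracked across repeated tests of a change idea.

```python
from dataclasses import dataclass, field
from statistics import mean

# Illustrative sketch only: a minimal record of Plan-Do-Study-Act (PDSA)
# cycles. The field names and data are assumptions for illustration, not
# drawn from Bryk's materials.

@dataclass
class PDSACycle:
    change_idea: str        # the improvement idea under test ("Plan")
    prediction: str         # what we expect to happen, stated up front
    measure: list[float]    # practical measurement gathered during "Do"
    learning: str = ""      # "Study": what the data actually showed

@dataclass
class ImprovementProject:
    aim: str
    cycles: list[PDSACycle] = field(default_factory=list)

    def trend(self) -> list[float]:
        """Mean of the practical measure for each successive cycle:
        a crude view of whether the change is an improvement."""
        return [round(mean(c.measure), 2) for c in self.cycles]

project = ImprovementProject(aim="Raise weekly maths homework completion")
project.cycles.append(PDSACycle(
    change_idea="Text reminders to parents",
    prediction="Completion rises above 60%",
    measure=[0.55, 0.58, 0.63],
    learning="Small gain; effect strongest in Year 7"))
project.cycles.append(PDSACycle(
    change_idea="Reminders plus in-class starter review",
    prediction="Completion rises above 70%",
    measure=[0.66, 0.71, 0.74],
    learning="Clear improvement; adopt and spread"))
print(project.trend())
```

The point of the discipline is the explicit prediction recorded before each test: ‘learning by doing’ accumulates precisely because each cycle’s data is studied against what practitioners expected, not just collected.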

Improvement Science has targeted deep-seated weaknesses in the US public school system, serving demographics similar to priority groups in Australia, with very encouraging results on student developmental mathematics, student agency, and new-teacher retention and effectiveness. An equally important outcome from this work is learning how to initiate and sustain such improvement processes, an approach that we are now initiating in Australia, based on our schools work over the last decade in the UK and internationally. Tony Bryk is the leading figure in the Educational Improvement Science movement, and is in Australia this week giving public lectures and running Masterclasses for educational leaders in Adelaide, Melbourne and Sydney, with support from the NSW and SA Departments for Education, university education departments, and leaders from diverse schools.

This deeply engaged way of working with practitioners in the trenches is also profoundly challenging for universities to scale sustainably. Academics are used to short-term collaborations with partner schools — for as long as the next grant lasts — from which emerges academic knowledge but rarely an intentional effort to deliver practical tools or capacity-building for the school. Models of systemic innovation diffusion — and we mean two-way traffic between the universities and schools — point to the potential of strategic partnerships with educational consultancies who can scale and localise educational innovation and staff development in ways that universities struggle to deliver. This is a learning journey for universities, as well as for policymakers, school leaders and students.

Here’s the closing argument for the Minister and his team to consider. What would count as an approach that not only takes seriously the best educational research, but is committed to translating this into practical approaches for schools, and leverages the networked power of “the cloud + crowd” to share rigorous evidence of successes and failures? The furrows in the playing field of educational inequality certainly won’t be levelled by a new steam roller driving through tougher standards. You don’t help someone grow merely by measuring them more often at higher resolution. Veterans in the field know that sustainable improvement comes from growing learning partnerships with school leaders, teachers, students, parents and the local community. Easily said, but that takes a holistic, systemic mindset. It might even take a bit longer than the next election.

When it does come to quantifying impact, how do we do this intelligently, with integrity? It can be tough to gather good evidence in the daily routine of school, and teachers rarely bring expertise in research methodology or the tools to gather quality data. Short of having your own personal team of academics on hand to support your school, how do you innovate in a disciplined way, at scale, with evidence, sharing successes and failures along the way? Worryingly, the Grattan Institute reports systemic weaknesses in schools’ capacity to gather and use effective progress data. The bigger flaw they point to is a blinkered dependency on very high-stakes, disturbingly stressful and infrequent national tests: a poor diagnostic for an improvement strategy.

So for us, the question of demonstrating impact begs the question what kinds of learners are we trying to create? Ultimately, it is the assessment regime that drives what goes on in the classroom and how schools (and pupils) are judged. Universities must also bear responsibility for escalating the ATAR-arms-race that drives such behaviour in schools. For this reason, at UTS we often have to ‘de-program’ many first years out of their ATAR-egos, to explain that to really grow as learners, they are going to have to develop skills and dispositions that are not encouraged by high stakes exams. This drives our priority on using the power of learning analytics to provide rapid feedback loops for innovation, to develop not only literacy and numeracy — critical as they are — but the new student qualities needed to thrive in turbulence and complexity, and the new teacher qualities required to transform their practice.

Alternatively, we can let educational academics continue to generate the research outcomes that define the academic pecking order, while school improvement efforts continue to struggle — and wonder in another few years why the new dollars didn’t seem to make a difference.

Simon Buckingham Shum is Professor of Learning Informatics, and Director of the Connected Intelligence Centre, University of Technology Sydney

Ruth Crick is Professor of Learning Analytics & Educational Leadership, School of Education and Connected Intelligence Centre, University of Technology Sydney

Chris Goldspink is Co-Founder and Chief Scientific Officer of Incept Labs Sydney

Tony Bryk Australian tour kicks off


Ruth, Chris and I are delighted to be hosting Tony Bryk this week, with colleagues at a series of events in Adelaide, Melbourne and Sydney. Tony is the leading figure in the Educational Improvement Science movement, and we look forward to stimulating conversation with colleagues at  the NSW and VIC Departments for Education, and leaders from diverse school contexts.

Learn more. . .


Universities: core business (and analytics) in 2030?

Ruth and I have the privilege of working with Randy Bass [blog] and team at Georgetown University. Randy is a leading thinker on the deep purpose of higher education, and how this entails rethinking student qualities, and analytics.

Jump to 40mins for his closing comments in this keynote envisaging higher ed in 2030. Here’s the gist:

Our calling as a university is the formation of men and women (but many institutions do this of course). However, we do so in the context of a community of enquiry and knowledge creation (fewer institutions do this). Moreover, we do so for the public, common good (fewer still have this explicit mission). These three are interlocked and inseparable.

The railroad companies who thought they were in the business of railroads went bust. The ones who thrived understood they were in the transportation business.

What’s our equivalent?

Let’s call it Formation.
Or Transformation.
Or Integration.

But if we think we’re in the business of Content, Skills or Information Transfer, then by 2030, we’re going to have a LOT of competition.

…or, as we might say, Dead In The Water.

His Formation by Design (FxD) initiative is defining the contours of this new landscape, and their progress report is an inspiring read (disclosure: it includes material from our contributions to a symposium last June). Or check out the video roundtable discussion series he hosted called Reinvent University for the Whole Person. He was also on the team of (what I think is) the largest national ePortfolio initiative in higher education, a reflection of the importance being placed on reflection for transformational learning.

Randy and team: all power to you as we figure out together how to redefine our calling, to help students find theirs. Along the way, let’s reinvent the environments and metrics that will constitute the new evidence base in 2030 🙂