Innovation Unit – Engaged Learning
https://engagedlearning.co.uk

Lies, Damn Lies, And Conscious Misrepresentation of Evidence
https://engagedlearning.co.uk/lies-damn-lies-and-conscious-misrepresentation-of-evidence/
Sat, 05 Nov 2016

Credit: S H Chambers

An earlier post of mine, on ‘what counts as evidence’, generated a healthy debate, and I thought I could leave the thorny problem of ‘what works’ in education for a while. Maybe lighten the mood with a blog about the all-out assault on the judiciary in post-Brexit Britain, or what’s an appropriate response to a Donald Trump presidency, something like that.

But the ‘evidence’ issue reared its contentious head again yesterday, November 4th, so The Donald might have to wait.

The flashpoint was the publication of a report by the Education Endowment Foundation (EEF – the UK equivalent of America’s What Works Clearinghouse) on the impact of Project-Based Learning on literacy and engagement.

The Times Educational Supplement was the first to claim “Exclusive: Project-based learning holds back poor pupils”. And the predictable social media onslaught ensued.

It soon became abundantly clear that hardly any of the Tweeters had bothered to read the actual report. In fact, I doubt very much that many of them had even gone beyond the abbreviated version of the TES article, the fuller version of which is only available on subscription. So, before we go any further, please spend 5 minutes reading the EEF summary here. Better yet, read the full report.

The summary doesn’t get off to a great start with a bizarrely restrictive definition of PBL:

“Project Based Learning (PBL) is a pedagogical approach that seeks to provide Year 7 pupils with independent and group learning skills to meet both the needs of the Year 7 curriculum as well as support their learning in future stages of their education.”

But let’s move on to the gist of the conclusions:

“Adopting PBL had no clear impact on either literacy (as measured by the Progress in English assessment) or student engagement with school and learning.” Perhaps mildly surprising, as PBL is often touted – by people like me – as a way to enhance student engagement. Otherwise, nothing to see here.

“The impact evaluation indicated that PBL may have had a negative impact on the literacy attainment of pupils entitled to free school meals. However, as no negative impact was found for low-attaining pupils, considerable caution should be applied to this finding.”

I confess I’ve read this statement repeatedly, and I’m still none the wiser. Are free school meals students therefore not ‘low-attaining’? If considerable caution should be applied, why draw the conclusion in the first place?

This was the focus of the TES headline. Their journalist clearly read the full EEF report, but didn’t think it their responsibility to draw attention to the caveat: “Overall, the findings have low security… 47% of the pupils in the intervention and 16% in the control group were not included in the final analysis. Therefore there were some potentially important differences in characteristics between the intervention and control groups. This undermines the security of the result. The reason that so many pupils from schools implementing PBL are missing from the analysis is largely due to five of these schools leaving the trial before it finished. The amount of data lost from the project (schools dropping out and lost to follow-up) particularly from the intervention schools, as well as the adoption of PBL or similar approaches by a number of control group schools, further limits the strength of any impact finding.”

So, almost half of the intervention schools had to drop out (for reasons we’ll come to in a moment). Furthermore, the intervention period was meant to be two years but, due to funding constraints, it was halved. This matters because most PBL experts agree that it takes three to five years before teachers feel truly confident delivering a different pedagogical approach – and can therefore expect high-quality student outcomes – whereas one can assume that the control schools had been working in their established ways for some considerable time.
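To make that attrition problem concrete, here’s a deliberately simple sketch in Python. Everything in it is invented – none of these numbers come from the EEF data – but it shows the general statistical point: if schools leave one arm of a trial in a way that’s related to how their pupils are doing, the surviving groups stop being comparable, and an apparent ‘effect’ can emerge (in either direction) from an intervention that does precisely nothing.

```python
# Hypothetical illustration only: the numbers below are invented, not taken
# from the EEF trial. Both arms are simulated from the SAME distribution,
# so the true effect of the "intervention" is exactly zero.
import random

random.seed(42)

def school_scores(n_pupils=100, true_mean=100.0, sd=10.0):
    """Simulate literacy scores for one school's pupils."""
    return [random.gauss(true_mean, sd) for _ in range(n_pupils)]

def pooled_mean(schools):
    """Mean score across every pupil in a list of schools."""
    pupils = [score for school in schools for score in school]
    return sum(pupils) / len(pupils)

# Twelve schools per arm, no real difference between the arms.
intervention = [school_scores() for _ in range(12)]
control = [school_scores() for _ in range(12)]

# Mimic differential attrition: the five lowest-scoring intervention schools
# drop out of the analysis, while the control arm loses only one school.
intervention.sort(key=lambda s: sum(s) / len(s))
intervention_analysed = intervention[5:]   # 5 of 12 schools missing
control_analysed = control[1:]             # 1 of 12 schools missing

gap = pooled_mean(intervention_analysed) - pooled_mean(control_analysed)
print(f"Apparent 'effect' of a do-nothing intervention: {gap:+.2f} points")
# Any gap printed here is an artefact of who went missing, not of anything
# the intervention did - which is why the evaluators flag low security.
```

Flip which schools happen to drop out and the spurious gap flips sign; the direction isn’t the point, the loss of comparability is.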

And if that’s not enough to cause doubt, consider this from the report:

“… for some that did not get allocated to the intervention group, faithfully adopting the control condition was seen as being detrimental to their pupils’ learning and therefore some of these schools chose to implement a version of PBL anyway.”

(Full disclosure: I can confirm this took place. Although I’m a Senior Associate at the Innovation Unit, which managed the trial, I was not part of the team. I do, however, train schools in the use of Project-Based Learning. I discovered at the end of a training event that the school I’d been training was part of the control group and was intending to introduce PBL to its students. Contamination of evidence is anathema to educational researchers and further undermines the usefulness of the conclusions.)

Another central tenet of randomised controlled trials is that like is compared with like: if you want to know whether an intervention will improve literacy scores, you should have schools that were comparable before the intervention began. According to the Innovation Unit blog post in response to the report: “8 of 11 schools in the study were ‘Requires Improvement’ (an OFSTED categorisation indicating cause for concern) or worse (national average is 1 in 5)… The control group were stable in comparison, with 8 of 12 schools Good or Outstanding”.

So, here’s the nub of it: the newspaper article claimed that the report demonstrated that “FSM pupils in project-based classes made three months’ less progress in literacy than their counterparts in traditional, subject-based lessons.”

But this was based upon literacy data from schools, 80% of which would already have had poorer test scores than most of the control group schools. Is it possible, therefore, that the kids tested might actually have been more than three months behind their control group counterparts even without the PBL intervention?

Let’s use a simpler analogy to highlight this flaw: if you feed me anabolic steroids for a year, I’ll still come in a long way behind Usain Bolt in a 100m race. Surely a more sensible test would be to see whether my own personal times improved after taking the drugs, not whether I could beat someone who was already faster than me?
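If you prefer the point in numbers rather than sprinting metaphors, here’s an equally toy sketch – invented figures again, and emphatically not the trial’s actual analysis: when one group starts well behind, a straight end-point comparison bundles that head start in with whatever the intervention did, whereas measuring each group against its own baseline separates the two.

```python
# Toy numbers, purely hypothetical - not taken from the EEF report.
baseline_gap = -4.0       # assume intervention schools start 4 'months' behind
pbl_effect = +1.0         # assume PBL genuinely adds 1 month of progress
control_progress = 12.0   # assume control pupils make 12 months of progress

intervention_end = baseline_gap + control_progress + pbl_effect   # 9.0
control_end = control_progress                                    # 12.0

# End-point comparison: the intervention pupils look 3 months behind,
# even though (in this made-up world) PBL actually helped.
print(intervention_end - control_end)                             # -3.0

# Comparing each group's progress against its own starting point
# recovers the assumed effect instead.
print((intervention_end - baseline_gap) - (control_end - 0.0))    # +1.0
```

That, in essence, is the steroids-and-Usain-Bolt problem.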

Some on Twitter justifiably asked why almost half of the data was missing from the intervention schools, but not the control group. It turns out there were a variety of reasons, most of which point to the difficulty of carrying out trials in the fear-driven English schools system. Five of the twelve intervention schools withdrew during the trial. Three had a change of headteacher (presumably in response to OFSTED judgements), and two were taken over by academies. One withdrew after the trial was cut from two years to twelve months. One school refused to submit its students to the test, arguing (with some justification) that you can’t hope to see improvements in literacy scores twelve months after introducing PBL. If you, as a leader, were facing future closure if results didn’t improve, would you introduce a significant change like project-based learning, or knuckle down and overdose on test-prep? Yep, me too. So, does it make sense to study the impact on such schools?

Before I get on to the two ‘J’accuse’ parts of this post, allow me to share a few quotes from the evaluators’ report:

“… it is not possible to conclude with any confidence that PBL had a positive or negative impact on literacy outcomes”

“… schools reported finding positive benefits from the programme in terms of attainment, confidence, learning skills, and engagement in class”

“… the Innovation Unit’s Learning through REAL Projects implementation processes were particularly effective for the target (that is, willing and with the capacity) schools, and the feedback from those schools was almost entirely positive.”

“The need for improving skills appropriate for further study and those valued by employers in the modern workplace (a central aim of PBL) has not diminished, but probably increased. This study picked up the value of these skills to pupils’ learning and future potential through the process evaluation, but was not able to measure these skills as an outcome.”

Were any of these quoted by the TES? Of course not. I realise that “Study Doesn’t Prove Much Of Anything” is the kind of headline more suited to The Onion than the TES, but journalists have a responsibility to provide balance, especially in the education press, given that we’re living in an evidence-based era. If they don’t provide that balance, we may as well just forget about truth and see who can shout the biggest lie. The evaluators, it seems to me, did a good job of presenting a fair and balanced assessment, working within the constraints of a difficult set of circumstances, on a project that probably should have been curtailed when so many schools dropped out.

It’s just a pity that the ideological bias of our media sought to significantly misinterpret it.

Which brings me to my second accusation. I don’t think I’m being unfair when I say that, in the increasingly fractious educational debate between ‘traditionalists’ and ‘progressives’, a greater proportion of traditionalists insist that we should look objectively at the evidence and make informed policy and practice decisions based on the facts, not on misinterpretations.

So, it was particularly disappointing to see some of those same people jump on the TES headline rather than read the evaluators’ report. Sadly, this includes Nick Gibb, Minister of State for School Standards:

[screenshot: Nick Gibb’s tweet]

and Sam Freedman, Director of Teach First:

[screenshot: Sam Freedman’s tweet]

and SchoolsImprovementNet:

[screenshot: three tweets for the price of one]

And a host of others on Twitter.

Rachel Wolf, of Parents and Teachers for Excellence, was quoted in the TES article (because I’m sure she’d read the full report), citing students baking biscuits during a project on slavery as evidence that PBL taught the wrong things. I shall henceforth remember her fondly as ‘rigorous Rachel’. In their defence, the TES at least shared quotes from some of the intervention schools. One of these schools – which has presumably been failing its Year 7 students – was Stanley Park High School. In an unfortunate disjuncture, this same school had just been awarded 2016 Secondary School of the Year by… the TES. You couldn’t make it up. Oh, wait, they just did.

In my attempts to get some of the above to read the original evaluators’ report, in full, before exercising their prejudices, I was accused of dismissing the evidence. I’m not. I’m the first to say that the evidence base for PBL is weak (though stronger than many claim, and the evaluators have done a good job of presenting the key messages from a literature review). Many people who see very positive impacts of PBL – in extraordinary schools and networks like Expeditionary Learning, High Tech High and New Tech Network in the USA – are very anxious to see large-scale, reliable research emerge on the breadth of impact of PBL.

But, until we have that, can we stop hysterically distorting what little evidence we have? The rising popularity of PBL around the world has enthused and revitalised legions of classroom teachers. Until we’ve got some conclusive evidence that it’s not working, can we allow them to exercise their own professional judgement, please?

10 key lessons from the Learning Futures programme
https://engagedlearning.co.uk/10-key-lessons-from-the-learning-futures-programme/
Tue, 04 Oct 2011

The phase of working intensively with schools on the Learning Futures programme has ended (for now), and we are busy producing tools that teachers and school leaders can use to bring about change. In our own way, we’re trying to move forward the ‘learning revolution’ that the TEDx London event recently called for – but, I hope, with a sense of pragmatism, based upon the reality of schools and the structures, and strictures, they operate within.

Today, I was working on a manual we’re producing for schools that offers support in making innovation happen within the high-stakes culture of accountability that hangs over them. It’s been an opportunity to reflect on what we’ve all learned on the initiative. I have found it a privilege to work with the 40+ schools and the international partners – the learning has been deep and profound, and I pay tribute here to the school leaders and teachers we’ve learned from. Some of them are featured in this sneak peek at one of a series of films we’ve produced on leading innovation in education:

The interviews deal with operational and cultural challenges, but all of these are secondary to the primary goal of creating pedagogical change. Changing the teaching and learning paradigm is what our project is about, and we have all come to realise how tough a challenge that can be, even in our most progressive schools. Whilst we all had different starting points, and different end goals, we saw at our last gathering in July that a consensus had emerged on what we’d learned about great learning experiences.

So, whilst it’s impossible to condense two years’ experience into 10 bullet points, I’m going to try to summarise our key understandings:

  1. Getting students immersed in purposeful projects leads to engaged learning and builds relationships – the 4 P’s of designing engaging learning activities have provided a useful checklist for learning designers (aka teachers): Placed locates the activity in the student’s life; Pervasive means that they can continue their learning independently, when there’s no teacher around; Passion-led just makes sense, as students learn best when they care about the issue; and Purposeful gives the learning meaning and relevance.

  2. Authentic learning experiences benefit from real-world connections – so much of what is learned reinforces the notion of School as Enclosure, detached from community and local enterprises. Ron Berger’s ‘An Ethic of Excellence’ shows how compelling real-world connectivity can be. Yes, it’s a lot of extra work to cultivate sustained external partnerships, but your students will love you for it.

  3. Simple structures and project designs facilitate complex and deep learning – while complex schedules/timetables, learning plans and assessment criteria too often lead to superficial learning. Minutely detailed individual learning plans might comfort adults who are under pressure to show student progress in every single lesson, but they often only demonstrate that the teaching is progressing – learning progress is often a series of peaks and troughs, setbacks and breakthroughs. It’s astonishing how complicated schools can become, and it’s often because their structures are there to make the organisation work for the professional’s benefit, not the student’s. Turning that oil tanker around, and keeping it simple, requires the kind of radical vision expressed in the video clip.

  4. The expectation of quality products leads to purposeful work and deeper learning – encourage students to consider their learning as work, judged by experts in their field. Then it has relevance and meaning, and provides a sense of agency for students.

  5. Great projects can be teacher-led, student-led or product-led – but are driven by passion and real-world connections – the old process/product, student/teacher ownership issues will always be with us. There’s no one path to designing great learning experiences. But designs driven by passion always carry more impact than those driven by the need to ‘cover the standards’.

  6. Students have an entitlement to feel proud of their work. Pride results from multiple drafts and peer critique – our art schools have known this for decades. Having students responsibly critique each other’s work gives power and depth to their learning, and continually raises the bar of their expectations. Too much of the work students submit is not valued (by either teacher or student) because it was their first, and only, draft. Do less, but do it better, through multiple drafts. If the task has depth and meaning for the student, they’ll want to get it right – they’ll understand that caring about the outcome means repeatedly drafting it.

  7. Engagement results from modelling exemplary student work – great schools display exemplary student work, and use those exemplars as both the focus of discussions around ‘what makes this great?’ and a hook to inspire and engage. Too often kids begin a task without knowing what a great outcome would look like.

  8. Public displays of student work demonstrate both the product and the process of learning, provide entry points to engage parents and the community, and allow the dialogue to shift from ‘grades’ to ‘growth’ – isn’t this what we’d all want parents’ evenings to look like? Present the learning to parents, have them interrogate the learners, show how the learning grew, and the grades no longer dominate. Your school becomes more of a Learning Commons, too.

  9. Don’t atomise the timetable: fewer blocks, fewer subjects and fewer teachers lead to better relationships and deeper learning – as more schools realise the fundamental importance of relationships, so we understand that less is clearly more. Students feeling ‘known’ by fewer teachers, through longer blocks of time, working across subjects where possible – all of this leads to more powerful learning and happier students.
  10. Transforming engagement and learning cannot be achieved without transforming professional development – this is perhaps the hardest realisation of all. If you’re serious about transforming learning, you have to make the time for staff to share, discuss and learn from each other, and from external experts. For the Learning Futures coordinators in our schools, the biggest catalyst for change was simply the regular opportunity to collaborate with other schools, to be inspired by visiting experts in learning, and to have the time to share with their colleagues in school. I know of no other profession, no other industry, that would expect its employees to carry out research and development on their own time, and we really have to get serious about professional development if we’re to break through the achievement plateau we’re on.

So, we now look forward to translating these understandings into tools that can change practice. If any of this resonates with you, and you’d like to receive the tools when published, please contact Alec Patton: [email protected]

 

Aspirations and Inspirations
https://engagedlearning.co.uk/aspirations-and-inspirations/
Mon, 16 Nov 2009

The Learning Futures project held its second National Event this week, in London. Each of our partner schools attended, together with their Head Teacher, and we shared stories, data and experiences after 8 weeks of implementing their learning innovations, as part of the programme.

Two months in can quite often be a wobbly time when you’re trying to do something out of the norm in teaching and learning. It’s often entirely natural to question oneself and wonder whether you’re doing the right thing. So, it was immensely reassuring this week to hear from the man who is CEO of possibly the best schools in the world. Larry Rosenstock talks fast. We gave him two hours – what he packed into that time was equivalent to a weekend masterclass. Most importantly, without knowing too much about our common project themes, he highlighted all of them as cornerstones of High Tech High’s success: Enquiry/Project-Based Learning forms almost the entirety of their curriculum; Mentoring & Coaching is fundamental to student support; Expanding the Locations and Partners for Learning enables all of the learning to be authentic; and students and staff Co-Construct the learning throughout the place.

Their results are spectacular: almost every student graduates and enters college, even though their intake is mixed across all social classes and abilities. Such results have attracted some pretty big hitters, too. Bill Gates has been a regular visitor in recent years and, for him, High Tech High proves that ‘you can work hard and have fun at the same time’.

The philosophy of the school is neatly summed up in this interview with Larry:

It’s clear that High Tech High encourage students to have the highest aspirations, and Larry is unrepentant in judging the quality of the teacher by the quality of the work students produce. So no ‘laissez-faire’ attitudes are tolerated here. But, like the schools taking part in Learning Futures, the pedagogy starts from an acknowledgment that nothing can be taught to students – only they can learn. And they do that best when they’re engaged. Finding, in High Tech High, an exemplar of what we are describing as the ‘four P’s’ of engagement (hands-on learning which is placed, principled, purposeful and prolonged) has reaffirmed the faith our schools have in their students.

I’d urge you to visit the digital commons on the High Tech High website – I’m confident you’ll find it as inspiring an experience as listening to Larry Rosenstock was this week.
