Friday, 29 June 2012

The consequences of ignoring the evidence [updated]

[this post was updated at 9pm on 29/6/2012 to respond to Sam Freedman (DFE)]


In justifying its decision to abolish the Educational Maintenance Allowance (EMA), the government argued repeatedly that the evidence supported it; in particular, that EMA was ineffective in increasing educational participation among young people. For example, the Minister for Further Education said in November 2010:
"we have focused on the evaluation evidence and other research which indicates that EMA does not effectively target those young people who need financial support to enable them to participate in learning.  It will be replaced by a scheme that does."

This was a gross misrepresentation of the evaluation evidence and research, as was pointed out at the time by many of those responsible for producing it. In a letter to the Guardian, co-signed with a number of other economists who have worked in this area, we wrote: 
"extensive quantitative evaluations of the EMA have shown that it has significantly improved both staying-on rates and qualifications for students from poorer backgrounds. Econometric evidence from researchers at the Institute for Fiscal Studies, published in 2005, found that the EMA significantly increased participation rates in post-16 education among young adults, and concluded that its impact was "substantial"; subsequent IFS research, published in 2008, showed, moreover, that the EMA significantly improved their educational outcomes.
The government has chosen to ignore this rigorous and independent evidence, and has instead argued that the abolition of EMA is justified by high levels of "deadweight" – ie that many young people in receipt of the EMA would remain in education even without it. But even if this is true, it is not a sound economic argument for abolishing EMA - it could equally be argued that the government should not vaccinate children against meningitis or polio, since the vast majority of children wouldn't contract these diseases anyway. Virtually all government programmes, even the most successful, have some deadweight cost.
 The real question is whether the benefits, economic and social, of the EMA exceed its costs overall. On this, the IFS concluded that even looking at only the narrow economic benefits of EMA – the higher wages that its recipients would go on to enjoy in future – these are likely to exceed the costs in the long run. And this takes no account of the wider social and economic benefits.  Over the long term, growth depends above all on the skills and qualifications of the workforce... Abolishing the EMA – which enables many young people to gain the qualifications that they will need in the future – is not a recipe for long-term growth."
Yesterday, the Department for Education released data on the educational participation of young people, which it describes as "the Department's definitive measures of participation at ages 16-18...in the context of historical trends". The release rightly highlights the following key point:
"Participation in full-time education fell by 1.8 ppts at age 16 to 86.2%, the first fall since 2001."
This is clearly extremely bad news. The labour market prospects of young people without decent qualifications are very poor, both because of current economic weakness and because of longer-term trends. The proportion of 16 year olds staying on should be rising - as it was up to 2010, in part because of EMA - not falling. It is too soon, and we do not have enough detail, to conclude definitively that this reversal is due to the abolition of EMA. But the Department, and its Ministers, urgently need to tell us - with some proper evidence and analysis this time - what is going on and what they are going to do about it.


Update: Sam Freedman, a policy advisor at the Department for Education, responded to this post in a lengthy discussion with me on Twitter. You can read the whole thing if you want, but his main substantive point was this:
"the percentage in education or training has gone up. That's the key figure..why didn't you mention [it] in your blog?"
Except that's definitely not the key figure, as far as DFE itself is concerned. It includes participation in "other education and training", which is mostly part-time and (contrary to his subsequent attempt to justify his position) explicitly not "employment-based". This doesn't seem like a great outcome for 16 year olds. Indeed - and this is why I didn't mention it - the official DFE statistical release doesn't mention this figure in its "Headlines" section.


What are the outcomes that DFE think matter? I reproduce in full the "Headlines" paragraph that talks about 16 year olds.

"16 year olds 
• Participation in full-time education fell by 1.8 ppts at age 16 to 86.2%, the first fall since 2001, but there was a rise in part-time education (+1.6 ppts) and work-based learning (+0.1 ppts). 
• The overall proportion in education and work-based learning increased by 0.1 ppts, although in rounded form the figure remained at 95.5%. However, due to changes in the underlying data, there was a change in the methodology to determine the size of the overlap between fulltime and part-time education and work-based learning in 2011. We estimate that without this change, the proportion of 16 year olds in education and work-based learning would have fallen by 0.1 ppts (see Technical Notes section E for further details)
• The proportion of 16 year olds NEET rose slightly, from 2.7% to 2.8%."

In other words, on all three outcomes for 16 year olds that the DFE release highlighted, things got worse.  I think it's reasonably clear who is being selective about the use of statistics.
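For readers who want to check the arithmetic behind the second bullet, here is a minimal, purely illustrative sketch using only the percentage-point changes quoted above; the details of the overlap adjustment are set out in the release's Technical Notes, which I am not reproducing here.

```python
# Published percentage-point changes for 16 year olds, taken from the
# DFE release quoted above.
full_time_change = -1.8   # full-time education
part_time_change = +1.6   # part-time education
wbl_change = +0.1         # work-based learning

# Simple sum of the component changes, ignoring any change in the overlap
# between the categories.
net_change = full_time_change + part_time_change + wbl_change
print(f"Sum of component changes: {net_change:+.1f} ppts")   # -0.1 ppts

# The release reports a rise of +0.1 ppts in the combined "education and
# work-based learning" measure, and notes that without the 2011 change to
# the overlap methodology the combined figure would have fallen by 0.1 ppts
# - which is consistent with the simple sum above.
reported_change = +0.1
print(f"Reported headline change:  {reported_change:+.1f} ppts")
```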


Subsequently, once we had established this, Sam tried again to move the goalposts. First he questioned why I was focusing on 16 year olds, rather than 16-18 year olds. That's obvious, of course, as he well knows: the transitional arrangements for EMA abolition mean that you wouldn't expect much impact on 18 year olds in this data.


Finally, after we'd established that the outcomes for 16 year olds were (according to DFE's own interpretation) negative, he moved to the tradeoffs, saying:
 "at most a statistically insignificant fall. And £340 million has been saved."
Now, of course, I was careful in my original blog not to claim that direct causality had been established between EMA abolition and the deterioration in outcomes for 16 year olds.  And it is entirely legitimate to look at costs.  So if Sam's original response had been along these lines:
"Yes, these figures are disappointing and worrying. But it's not clear that the deterioration is the result of EMA abolition. And we have saved £340 million by abolition. More analysis is needed to work out what's going wrong and how - while still saving money - we can reverse these worrying trends."
then I would have regarded that as both honest and defensible.  But his attempt to choose his own facts completely undermines his position. He seems determined to make my original point for me - that the government's primary interest is in making the evidence fit the policy, rather than the other way around.  Very disappointing. 


