Monday, 17 October 2016

Troubled Families - anatomy of a policy disaster

[IMPORTANT DISCLAIMER: This blog represents my personal view only: not that of NIESR or of the evaluation consortium led by Ecorys]

People sometimes ask me why I spend so much time correcting the inaccurate or misleading use of statistics by politicians and newspapers. After all, they say, it’s just politics and spin, mostly – a Minister getting her facts wrong doesn’t necessarily mean anything for real people or actual policy.  Perhaps I should calm down and focus on what's really happening, not the public statements.  

However, with the publication of the evaluation of the Troubled Families Programme (TFP), we have a perfect case study of how the manipulation and misrepresentation of statistics by politicians and civil servants – from the Prime Minister downwards – led directly to bad policy and, frankly, to the wasting of hundreds of millions of pounds of taxpayers’ money. The findings of the evaluation are set out here - those interested in the detail of the TFP should read the synthesis report, produced by a consortium led by Ecorys and including NIESR, or at least the Executive Summary of NIESR’s National Impact Study. But, after trawling through literally hundreds of regressions, the bottom line is quite simple:
The key finding is that across a wide range of outcomes, covering the key objectives of the Troubled Families Programme - employment, benefit receipt, school attendance, safeguarding and child welfare - we were unable to find consistent evidence that the programme had any significant or systematic impact.  The vast majority of impact estimates were statistically insignificant, with a very small number of positive or negative results.  These results are consistent with those found by the separate and independent impact analysis using survey data, also published today, which also found no significant or systemic impact on outcomes related to employment, job seeking, school attendance, or anti-social behaviour.  
In other words, as far as we can tell from extensive and voluminous analysis of tens of thousands of individual records, using data from local authorities, DWP, HMRC, the Department for Education, and the Police National Computer, the Troubled Families Programme had no impact at all on the key outcomes it was supposed to improve. It didn’t make people more (or less) likely to come off benefits. To get jobs. To commit fewer crimes. And so on. And, just to rub it in, these findings were confirmed by an entirely separate evaluation, conducted by another research organisation, which used a different data set and a different methodology to come up with essentially the same answer – no measurable impact on any of the key outcomes.


But the key point here – and the indictment of politicians and civil servants - is not that the TFP didn’t achieve what it set out to do. That's unfortunate, of course. But successful policymaking requires experimentation and risk-taking – and by definition, sometimes that results in failure. If new programmes never failed to deliver the promised results, that would show government was not taking enough risks. Failure, in itself, should not be the issue. Indeed, many social policy experts thought that the basic principles underlying the programme made a lot of sense. The point is that it was the government’s deliberate misrepresentation of the data and statistics that led to badly formulated targets, which in turn translated into a funding model that could have been designed to waste money. Bad stats meant bad policy.


And yes, I (and others - I'd note in particular Ruth Levitas and Stephen Crossley) told them so.  In February 2012, I explained the fundamental flaw in the analysis – that the government was taking a set of families who were undeniably poor and disadvantaged, and redefining them – without a shred of evidence – as dysfunctional and antisocial.  I said:



What began as a shortcut taken by civil servants with the data was translated into a speech by the Prime Minister that simply misrepresented the facts. That in turn resulted in sensationalist and misleading headlines; the end result, more likely than not, will be bad policy.

Did they stop, listen, and think? No. Instead they chose to translate an obviously flawed analysis, constructed for the purposes of a speech, into local level targets and funding.  Four months later, I wrote:


Even leaving aside the morality of using the language of "stigmatising" with respect to a set of families many of whom neither deserve nor will benefit from any such thing, this is a terrible way to make policy.  Using data - and a completely arbitrary national target number - that everyone knows are simply wrong, solely because it would be embarrassing to admit a mistake, will make the programme less effective and risks wasting public money.  Not only does it reflect badly on Ministers, it also does no credit to the senior civil servants who allow the publication of information which - at the most charitable - appears to reflect a complete lack of understanding of the relevant data.   This is a clear case for the National Audit Office. 

We can now skip ahead three years.  It was at this point, in March 2015, that Ministers decided to pre-empt the result of the evaluation, claiming that:


More than 105,000 troubled families turned around saving taxpayers an estimated £1.2 billion

This was untrue.  And we – including the civil servants responsible for the press release - knew it at the time, as I pointed out:


We have, as of now, absolutely no idea whether the TFP has saved taxpayers anything at all; and if it has, how much.  The £1.2 billion is pure, unadulterated fiction.

But it was worse than that.  As Stephen Crossley observed, anyone who actually bothered to read the CLG report in detail would have realised that the TFP targeting and funding model was - just as I had predicted - resulting in huge misallocations of money:


Manchester (for example) have identified, worked with and turned around a staggering 2385 ‘troubled families’. Not one has ‘slipped through the net’ or refused to engage with the programme. Leeds and Liverpool have a perfect success rate in each ‘turning around’ over 2000 ‘troubled families’. By my reckoning, over 50 other local authorities across the country have been similarly ‘perfect’ in their TF work. Not one single case amongst those 50 odd councils where more ‘troubled families’ were identified or where a ‘troubled family’ has failed to have been turned around.

Commenting on Stephen's analysis, I said:


In other words, CLG told Manchester that it had precisely 2,385 troubled families, and that it was expected to find them and “turn them around”; in return, it would be paid £4,000 per family for doing so. Amazingly, Manchester did precisely that. Ditto Leeds. And Liverpool. And so on.  And CLG is publishing these figures as fact.  I doubt the North Korean Statistical Office would have the cheek.

At this point, it should have been blindingly obvious to the most casual observer that TFP was not - as the government had claimed - a "payments by results" programme. Numbers which had absolutely no basis in any objective reality had first become the basis for targets, then for claimed "success", and then for money.  It wasn't payment by results. It was make up the results as you go along. And cash the cheques.  The results of the evaluation should hardly come as a surprise.  

As a postscript, it's worth noting the caveats in NIESR's evaluation, which state, carefully and correctly, that issues with data quality mean that:


the results cannot be taken as conclusive evidence that the programme had no impact at all

This is quite true. But CLG's attempt to use this as an excuse for their failures conforms perfectly to the classic definition of the Yiddish term "chutzpah" (cheek or audacity in English) - "the man who kills his mother and father, and asks the court for mercy because he's an orphan". The data quality issues are entirely the result of the decision by the government to press ahead with spending hundreds of millions of pounds on an untested, unpiloted programme, on the basis of little or no evidence, rather than piloting it and/or rolling it out in such a way that a more robust data collection and evaluation strategy would have been possible.


So whose fault is this sorry saga?  The senior civil servant who directed the Troubled Families Programme, Louise Casey, once said:



If No 10 says bloody 'evidence-based policy' to me one more time, I'll deck them.

With David Cameron, who appointed her to this role in 2011, we can be pretty sure no physical violence was required. Nothing he told her interfered with her instincts to press ahead, and to ignore both the evidence and the warnings from me and from others.


So it starts at the top. But while most of the blame rightly rests with Ministers, including the former Prime Minister, and the responsible senior civil servants at CLG - and they should be held accountable – it is also important to note that the normal checks and balances that should have picked up on all this simply failed. What on earth did the Treasury think it was doing, allowing public money to be squandered like this? Were they too busy cutting core local authority budgets to notice that CLG were throwing hundreds of millions of pounds away with no serious scrutiny?


Nor was it just civil servants. Parliament didn’t do any better. Where was the National Audit Office? As far as I can tell, it produced one, frankly mediocre, report that fudged or buried the key points - which were by this time all in the public domain. The key Parliamentary Committees - the Public Accounts Committee (PAC) and the CLG Committee? They too were asleep at the wheel. Opposition parties and MPs do not appear to have ever raised any of these points, even though they had ample opportunity to do so.


On Wednesday, when the PAC holds a hearing – Ms. Casey will be testifying – they will have a chance to redeem themselves by asking some of the questions that should have been asked years ago. You can already read the written evidence – I’d particularly highlight that from Stephen Crossley.

What lessons can we learn from this?  Most obvious is the one I started with. Statistics and facts do actually matter. They translate directly into policy, and hence into real outcomes for real people. But for that to happen in the way that it should – and not be distorted by politicians on the way – the process not just for the production but for the analysis and interpretation of statistics and evidence needs to be genuinely independent. I set out some modest proposals here.

11 comments:

  1. Yes v good blog but please give a nod to case workers who bust a gut doing v difficult work

    1. True. The qualitative evaluation has plenty of examples of good work, good practice etc. My criticism is of the targets/funding model.

  2. As far as I'm aware, no one has pointed out the role of Emma Harrison, owner of A4e, in this story. Her "family champions" idea, and the prominence given to it by David Cameron, spurred the glorification of what was otherwise a sensible initiative, already taken by several councils: to make a single officer the conduit for the many services involved with some families.

  3. In practice, at the local level, Troubled Families is mainly a contract management mechanism. It requires data submissions so that certain levels of funding can be secured. The troubled families themselves are not necessarily getting any different type of support, or indeed, any more or less support than before. The funding arrives, gets mixed with other funding sources, and then local authorities make their decisions on how to manage family services. On top of this, each family (parent 1 + parent 2 + kids) gets an average of £3,000 per year, not necessarily more than one year in a row, so there is only so much impact that can be expected, probably less than 'turning lives around'.

    1. Yep - as someone who works for a local authority in London and has some closeness to TFP, your assessment is entirely consistent with my experience. It's a highly convoluted means of funding councils' children's services departments.

  4. Good blog. Confirms that Cameron was an opportunist, eager to grab a headline at any cost. I think it's fair to say that when we have a catastrophic failure in government policy it can often be linked to failure to implement evidence based policy... and that may well be because the political cycle means that a government would rather implement a half-baked policy to grab headlines now rather than test it for two years and potentially allow a future administration to fine tune it, roll it out nationally and take credit for it. Isn't democracy wonderful?

  5. I can pull quotes from reports too...

    "Given the quite major limitations imposed by data quality (see Chapter 3 in Bewley, et. al., 2016), the results of the administrative data analysis in isolation cannot be taken as conclusive evidence that the programme had no impact at all"

    "It was not possible to obtain robust estimates on all outcome measures. Most notably, the comparison group for the administrative data analysis provided a poor match on one of the two school attendance measures, and the results are therefore inconclusive regarding the impact of the programme."

    "it should be noted that large scale impacts would not necessarily be anticipated for the Troubled Families Programme. Whilst the Troubled Families Programme was underpinned by a national framework and outcomes, it was managed through 152 local change programmes, with considerable discretion afforded to local authorities in how they identified, prioritised and worked with their families"

  6. The moment TF programmes moved from quality (intensive Family Intervention work) to quantity and PBR "matching", the writing was on the wall. PBR also created perverse incentives not to work with the most challenging families, those least likely to meet the success criteria. The numbers game also saw caseworkers having to step away prematurely from families who needed sustained support.

  7. I've just come across this blog. It's fantastic. I work as an analyst for a local authority.

    Will you put your modest proposals on this blog (rather than behind the FT paywall)? I would be fascinated to hear them.

    1. Extract below:

      But there is a line between putting a particular interpretation on statistics and misrepresenting them, and the current government makes a habit of straying over it. Both Prime Minister David Cameron and his deputy, Nick Clegg, have claimed that the government is paying down the national debt when, of course, it is doing no such thing. The prime minister took a statistic estimating the number of families suffering from severe deprivation, and then described them as “neighbours from hell”, guilty of antisocial behaviour and worse, although the original source said no such thing. Worse still, the same statistic was then used to allocate large sums of public money.

      Worst of all has been the behaviour of the secretary of state for my old department, who has been reprimanded by the independent UK Statistical Authority for pre-empting the release of official data by briefing selected newspapers with a message that the statistics, properly analysed, contradict. There is no meaningful sanction for this kind of abuse. A rap on the knuckles from the UKSA does not make up for the damage done. Neither does a belated, grudging correction in the offending newspapers.

      What is the answer? Some have suggested restricting the access ministers have to official statistics in advance of their public release; this is long overdue, but would not deal with the wider problem. Perhaps the government should be held to account more directly by parliament, but even that would not fundamentally change the political calculus.

      Better still, those who are responsible for government statistics should not be working for ministers. Create within each department an independent statistical and analytical unit. The difference with the status quo would be twofold. First, at the same time that data were passed to ministers, they would be published on an official website independent of the department concerned. Everyone would have equal access to the facts, at the same time. Politicians would do their best to apply spin. But they would not have the chance to tamper with the ball.

      Second, the independent number crunchers would be expected to comment publicly on the interpretation placed on their material by politicians and the media; especially when that crossed the line between half truth and outright lie.

      The rough and tumble political debate about numbers and data would continue, vigorously and uncensored. But the playing field would be levelled. And for the first time there would be a referee empowered to blow the whistle when there is a foul.
