Are we relying on unreliable research?

An article on research by Caroline Fiennes, Giving Evidence

“Ask an important question and answer it reliably” is a fundamental tenet of clinical research. And you’d hope so: you’d hope that medics don’t waste time on questions that don’t matter or that have already been answered, and you’d hope that their research yields robust guidance on how to treat us. Does the research our sector conducts to understand the effects of its interventions follow that tenet?

I suspect not. That is a problem, because poor-quality research leads us to use our resources badly. The example of microloans to poor villagers in Northeastern Thailand illustrates why. In evaluations which compared the outcomes of people who got loans (such as the amounts that households save, the time they spend working or the amount they spend on education) with those of people who didn’t, the loans looked pretty good. But those evaluations didn’t take account of possible ‘selection bias’ in who took the loans: perhaps only the richer or better-networked people wanted them, or were allowed to have them. A careful study which did correct for selection bias found that in fact the loans made no difference. The authors conclude that ‘“naive” estimates significantly overestimate impact’.
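To see how selection bias inflates a naive comparison, here is a minimal, entirely hypothetical simulation (invented numbers, not the Thai study’s data): the loan has zero true effect on savings, but because better-off households are more likely to take it, a naive takers-versus-non-takers comparison still shows a positive ‘effect’.

```python
import random

random.seed(0)

# Hypothetical sketch: each household has a baseline "prosperity" score.
households = [random.gauss(100, 15) for _ in range(10_000)]

# Selection bias: only the better-off households take the loan.
takers = [p for p in households if p > 105]
non_takers = [p for p in households if p <= 105]

# Savings depend on prosperity alone; the loan's true effect is zero.
def savings(prosperity):
    return 0.3 * prosperity + random.gauss(0, 2)

naive_effect = (sum(map(savings, takers)) / len(takers)
                - sum(map(savings, non_takers)) / len(non_takers))

# The naive comparison attributes the takers' higher savings to the loan.
print(f"naive estimated effect of the loan: {naive_effect:.1f}")
```

A study that corrects for selection (for example, by comparing like with like on baseline prosperity, or by randomising who is offered the loan) would recover the true effect of zero here.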

Such examples are rife. There is one in the current edition of Stanford Social Innovation Review, about a back-to-work programme, discussed here.

Spotting them requires assessing research against a quality standard. Though foundations fund masses of research – through charities’ M&E, sometimes conducted by the charities themselves and sometimes done independently – to my knowledge, only one has ever assessed the quality of the research it sees. It didn’t look pretty. The Paul Hamlyn Foundation looked at the research it received from grantees between Oct 2007 and Mar 2012: only 30% was ‘good’, and even that was using a rather generous quality scale. It even found ‘some instances of outcomes being reported with little or no evidence’.

Assessing the quality of research is bog-standard in medicine and increasingly common in the public sector. The Education Endowment Foundation already does it (in its toolkit) and the government’s other What Works Centres will too. The National Audit Office (NAO) recently published an analysis of the quality of almost 6,000 government evaluations, which contains a salutary nugget. Buried on page 25 is the finding that the strongest claims about effectiveness are based on the weakest research. This (probably) isn’t because the researchers are wicked, but rather because you can infer almost anything from a survey of two people: most social interventions have quite small effects, and robust research won’t let you claim an effect bigger than the real one.
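The small-sample point can be illustrated with another hypothetical sketch (invented numbers, not drawn from the NAO report): a programme with a small true effect, estimated many times with a tiny sample and with a large one. The tiny studies occasionally produce spectacular effect estimates purely by chance; the large, more robust studies cannot.

```python
import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.1  # a small true effect, in standard-deviation units

def estimate(n):
    """One study: compare n treated people with n controls."""
    treated = [TRUE_EFFECT + random.gauss(0, 1) for _ in range(n)]
    control = [random.gauss(0, 1) for _ in range(n)]
    return statistics.mean(treated) - statistics.mean(control)

# Run 1,000 tiny studies ("a survey of two people") and 1,000 large ones.
small_studies = [estimate(2) for _ in range(1000)]
large_studies = [estimate(500) for _ in range(1000)]

print(f"largest effect a 2-person study produced:   {max(small_studies):.2f}")
print(f"largest effect a 500-person study produced: {max(large_studies):.2f}")
```

A charity that runs the weakest study and reports its luckiest result can therefore claim an impact many times the true effect; a robust study stays pinned near the truth.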

Let’s put that finding the other way round. Charities competing for funding have an incentive to show that their impact is sizeable. The NAO’s finding implies that that is easier if they do bad research. So funders who rely on charities’ own research inadvertently encourage them to produce unreliable research.

So funders should look carefully – and independently – at the quality of evidence that we fund and use, lest we be misled into funding ineffective work. As mentioned, the methods and tools for assessing research quality (‘meta-research’) are established and proven in other disciplines. Giving Evidence is exploring doing meta-research, to assess the reliability of research produced by charities and used by funders. We are in discussion with relevant academics who would run the analysis and with luck will get the work funded by academic sources. We would like to talk to funders who are interested in understanding the quality of the research that they commission and use, with a view to improving it. If you are interested, please get in touch.
__________

Caroline Fiennes is director of Giving Evidence, a consultancy and campaign promoting charitable giving based on sound evidence. She serves on a board of The Cochrane Collaboration, an organisation producing meta-research in health through over 28,000 researchers in over 120 countries.
She is contactable at caroline.fiennes@giving-evidence.com and is on Twitter @carolinefiennes