Research Methods for Information Research
7. Beyond research methods
7.8 Evidence-based working or methodological fundamentalism?
Library research is largely conducted from within the information research enclave, but the research community in the USA increasingly recognises that, to gain any real recognition from policy makers and government funders, it must venture out into the surrounding educational research territory.
The problem this aspiration raises is that, for research to be taken seriously by the present Federal Government, it has to feature in a charming website called the What Works Clearinghouse25, established by the US Department of Education’s Institute of Education Sciences. Accustomed as we are (or are becoming) to evidence-based practice, this should come as no great surprise. The criteria for inclusion in the Clearinghouse are, however, rigid to say the least: research reports submitted to its review process are classified in one of three categories. ‘Meets evidence standards’ covers randomized controlled trials and regression discontinuity designs that meet its specific criteria (virtually impossible to achieve in any real educational research project concerned with library or information service provision); ‘Meets evidence standards with reservations’ covers strong quasi-experimental studies that have comparison groups and meet its other evidence standards; and ‘Does not meet evidence screens’ covers studies that provide insufficient evidence of causality or otherwise fall foul of the WWC criteria. The nasty trap for unwary researchers is that if you aspire to the second category and fail, you are consigned to the ‘Does not meet evidence screens’ group. As the leading researcher Keith Curry Lance recently remarked, “What a promising start to a postgraduate research career!”
This presents an ethical dilemma: should US researchers aspire to having library research taken seriously by central government even if the required research design reflects so narrow a conception of research that it becomes, in effect, anti-scientific? The views expressed by US researchers at one colloquium ranged from ‘We have to go there’ to ‘Isn’t it fortunate that library research funding hasn’t yet been polluted by these demands?’ The non-US researchers at this event were relieved that we do not yet have to make the same choice, but concerned that ‘methodological fundamentalism’ might prove just too attractive to our own governments, which are always drawn to simplistic solutions (however flawed).
In case we think that it is only in the US Department of Education’s Institute of Education Sciences that the patients appear to have taken over the asylum, it is worth looking at the paper given by Ernest House to the European Evaluation Society on ‘Democracy and evaluation’. After pointing to various fundamentalist trends in US Central Government, he claimed that:
“This authoritarian attitude has spread to other levels of [US] government … Some government agencies demand that all evaluations must be randomized experiments. Other ways of producing evidence are not scientific and not acceptable. There is one method for discovering the truth and one method only - the randomized experiment. If we employ randomized experiments, they will lead us to a Golden Age in education and social services. Such is the revelatory vision. There are sacred figures from the past and sacred texts. Core believers are absolutely certain they are right, that randomized experiments reveal the truth. They do not listen to critics, and they do not comprehend the arguments against the method. A mandate for randomized trials has been written into legislation without discussion with professional communities. Avoiding contrary ideas is part of the orientation, and, of course, the policy is enforced by government edict.”26
Meanwhile, there are signs that educational research communities elsewhere are not giving up without a struggle. In the UK, for example, it is noteworthy that the main organisation in this field is the ‘Evidence for Policy and Practice Information and Co-ordinating Centre’ (EPPI-Centre) at the University of London. Evidence-informed practice is certainly being taken seriously here, but the approach is sufficiently nuanced to allow for a variety of research approaches.
26. HOUSE, E.R. ‘Democracy and evaluation’. Paper presented at the 6th EES Biennial Conference, 2004. www.europeanevaluation.org/conferences