Research Methods for Information Research

1. Why look at research methods?

Why should the nitty-gritty details of different techniques and applications be of interest, except to professional researchers? There are at least three reasons why we should all be asking what exactly is going on when particular research approaches are adopted.

Information versus knowledge

As we amble into the ‘Age of Information’, the traditional apparatus of research publishing – peer review, competition for publication in recognised prestigious journals and with reputable publishers, and commonly accepted standards and protocols for presenting research results – is being swamped by the anarchy of the Internet. This makes life increasingly difficult: it is hard to sort the wheat from the chaff without the strong hints offered by the ‘traditional’ research publication industry.

In hailing the benefits of electronic access, too many people have failed to hold onto the key distinction between information and knowledge. Whether we are arguing that Internet access creates whole new virtual libraries for students or that knowledge management will rule the world, we must recognise what kinds of information we are dealing with and how wide is the gap between instant information and painstakingly acquired knowledge.

The challenge now being faced by the information community is that students (and in the age of lifelong learning this can mean all of us) now need better information-processing skills to understand the plethora of information on tap. They need to be better than previous generations at selecting and rejecting information and at making sense of what they have located. A vital element here is being able to gauge the soundness of any research report. Does it stand up as evidence? How reliable is the information and how far is it safe to generalise from the evidence offered? All these questions come back to what research methods are being used and how they are being applied (and reported).

Evidence-based working

Governments are showing great interest in evidence-based policy and practice, building on work conducted in North America and the UK on evidence-based health care.1 Underlying this approach is an aspiration to move clinical and other health decision-making away from a basis of what the practitioner learnt in qualifying, supplemented by later experience and innate prejudices, towards reliance on the latest and most complete available research. (There is also an implicit element of rationing in this approach that Governments may find attractive – “If you can’t yet prove that it works, you can’t spend money on it.”)

Will this approach work in other areas? Plenty of people evidently think so. Manifestations of their interest include the creation of a Centre for Evidence-Informed Policy and Practice in Education by the University of London Institute of Education at the behest of the UK Government, support for evidence-sifting centres by research funding councils, and the establishment of evidence-based policies and centres by Governments.

Interestingly, the health version of evidence-based practice has resulted in workshops being run for consultants, trainee doctors, nurses and many other health professionals on assessing research evidence. Questions about the methods used and how these affect the results are well to the fore. It will be fascinating to see whether the same pattern of information skills training for practitioners will be adopted in other areas.

Measuring the performance and impact of library and information services

Governments are showing ever more interest in performance measurement as a means of control. There is also growing practitioner interest in the scope for evaluating the impact of libraries and information services in ways that move beyond assessing the efficiency of services through performance indicators and targets. This question of impact indicators again raises strategic policy concerns, since many Government agencies are committed to broad sweep ‘qualitative benchmarking’, partly based on the work of the European Foundation for Quality Management’s Business Excellence Model.2

Our experience of working on these issues with education and public library senior management teams is that they resolve into two significant problem areas. The first is how to evolve an approach that works in teasing out performance and impact indicators that fit what people are trying to achieve. After that, it is back to research methods – in this case, the methods required to gather reliable and meaningful baseline data to set targets, and then to gather performance and impact information to evaluate progress.

If you want to look more closely at impact evaluation, especially in relation to libraries and information services, try our book on the subject!3

1. See: RICHARDSON, A., JACKSON, C. and SYKES, W. Taking research seriously: means of improving and assessing the use and dissemination of research. London: HMSO, 1990, and later publications from the NHS Executive.

2. Described at the Achieving Effective Performance Management in the Public Sector seminar, University of London, QMA Public Policy Seminars, London, October 1999.

3. MARKLESS, S. and STREATFIELD, D.R. Evaluating the impact of your library. London: Facet Publishing, 2006. ISBN 978-1-85604-488-2.