Research Methods for Information Research

5. Other Methods

5.2 Changing your approach

Some time ago I attended a higher education research conference which included a symposium on the use of concept mapping to support the development of university staff as teachers. What was striking was how dramatically the thinking of the presenters had moved on since they had agreed to do the session and provided their outline. This change could be seen in little things like jettisoning concept mapping as the focus for their symposium in favour of introducing the ‘declarative approach’ to the development of academics as teachers! Overall, the presenters’ own learning curve had moved through the use of concept maps for corrective intervention (focusing on what students were doing wrong), to a performative approach (based on using maps to highlight the difference between what teachers think about teaching and what they do in the lecture room) and then on to their current explorations of underlying issues of pedagogy and personal understandings using maps or other ‘symbolic modes of representation’.

Thinking about the combination of confidence, courage and self-awareness required to make such a shift led me on to thoughts about equivalent changes in the research methods arena. How often have you read an account of a project team jettisoning its methods and starting again? Not often, I suspect, but such wholesale rethinking does occasionally occur and may even be documented (or buried) in the project report.

When I worked at the National Foundation for Educational Research some years ago I was jointly responsible for securing funding for a national development project aimed at communicating to the target audiences of teachers and school managers what researchers had learnt about information skills in schools. We thought that we had an elegant and practical research design and the British Library (which funded the project) clearly agreed. We went ahead and recruited the project leader, who, after much thought, told us that the methods for delivering the programme lovingly outlined in the proposal wouldn’t work!

What should we do? What we did was to assemble the most creative group of lateral thinkers that we could identify through our research network and bring them together as a steering committee. We then put them in a room with three microphones (essential since everyone was talking at once for much of the time) and invited them to start again, which they did with relish. I greatly admired the aplomb of the British Library representative who ‘chaired’ the meeting and presided serenely over the proceedings, which, of course, resulted in the project being totally redesigned. (I now wonder whether this was really aplomb or whether he was shell-shocked.) The outcome was a highly successful project which is still having after-effects in the school libraries field.

Sometimes a shift in methodological direction may not be deliberate, or the need to change may be resisted. One example of a project that called for a fundamental rethink was one set in motion with national funding a few years ago to investigate the economic impact of public libraries. The methods identified for doing the work required the recruitment of a trained economist, but although such a post was specified in the proposal it did not prove possible to find anyone suitable. Instead of going back to the drawing board at this point, the project went ahead, relying heavily on the project leader, who had read economics at university some years earlier. Unsurprisingly, the project was less successful than it might otherwise have been.

Sometimes the inappropriate choice of research methods seems almost wilful. As an example, I was asked to comment on a social services-focussed dissertation prepared by a diligent student who carefully summarised the published research about the use of questionnaires with social workers. The clear message emerging from this work was: don’t. Since social workers are usually mired in the problems of their clients, they spend much of their time filling in forms to help these clients secure attention or resources. When they are not doing this they are likely to spend yet more time accounting for their activity or otherwise contributing to the central record-keeping system. Hitting them with a questionnaire simply won’t work, which did not stop the student from administering her own questionnaire and achieving a derisory response rate, adding further evidence that some people are questionnaire-phobic. It is vital to take into account the preferences and prejudices of your research subjects when choosing research methods and, if you get it wrong, you will have to change to other methods.

When choosing your research methods the key questions to consider are: what evidence do you want to get from the research, and what methods are likely to work best with the target group to obtain it? It would be nice to think that these questions are carefully considered by LIS researchers when putting proposals together, but the preponderance of questionnaires and focus groups as the favoured weapons of choice suggests that this is not always happening. If you are trying to engage in macro-level comparative data collection then questionnaires are likely to be appropriate, but only if the approach adopted in sampling, designing and field-testing the questionnaire, administering the survey, and recording, analysing and interpreting the data is both rigorous and systematic, which is patently not the case in many questionnaire surveys.

If you are trying to gather management information to help improve an information service, carefully designed and structured focus groups might be part of the mix, if only as a way of collecting sanitised comments from a range of users rather than uncovering nests of snakes. If you really want to find out how a target group of information users actually engages with information in shaping their lives, it is hard to see where either questionnaires or focus groups are likely to fit in. If you don’t agree with me about this, do by all means try these methods, but be ready to report your failure and move on to something else if they don’t really work.