Best Value and Better Performance in Libraries
B: Putting the model into action
B10: Baseline data
B10.1 Baseline data? What baseline data?
An interesting aspect of the overall process described here is that whenever we have worked through the model with education libraries, or with public or schools library service managers, it has been necessary to stop at this point. The reason is that when service managers look closely at what they want to achieve and at the sort of evidence that will tell them whether they are getting there, they usually do not have the baseline information about where they are now that would enable them to set sensible targets (e.g. what proportion of your current ICT users are unemployed or elderly? How many members of reading groups are more avid readers now than they were before they joined a group?).
When we have been involved as consultants in helping a senior management team or extended SMT to work through the process, there has usually been a pause of four to six months at this stage so that people can go off and assemble baseline information.
B10.2 Moving beyond ‘cause and effect’
Since it is difficult to establish clear ‘cause and effect’ relationships in the real world, public library service managers, as well as SLS managers, may find it useful to adopt the Ofsted approach (used in inspecting schools and local education authorities) of looking to see whether an activity is ‘bearing upon’ or is ‘linked to’ a service aim.
A (positive or negative) link can be inferred if there is usually an improvement or decline in a situation when a specific service or activity is introduced or withdrawn. An example picked up from Ofsted LEA inspections has ‘Providing multi-media resources’ as potentially bearing upon ‘differentiation (meeting individual student needs), motivating pupils, extending learning opportunities for pupils and cost-effective resource management’.
Remember that Ofsted uses this type of evidence (drawn from such activities as lesson observation, interviews with teachers and scrutiny of lesson plans) when deciding whether schools (or LEAs) are performing at a satisfactory level. ‘What’s good enough for Ofsted is good enough for…’?
B10.3 What to measure
Some guiding points when thinking about where to focus measurement or assessment of library services are that:
- inputs usually depend on an external agency as well as library managers/staff
- processes are controlled at some level by library managers and staff (even if they are carried out by external agencies)
- outputs usually entail some element of user involvement
- impacts usually focus on welfare or well-being and may be social, economic or environmental. More importantly, any library service impact will be an impact on people – finding out about impact entails asking people (whether through questionnaires or by talking to them).
B10.4 Data as evidence
You are gathering data in order to convince people that your service is working efficiently, that your work is having an impact, or that you are doing as well as (or better or worse than) other services. You may be doing this to help in your own managerial role, but you will also need to convince other people inside and outside the service.
Where you need information to help you manage the service and to satisfy elected members or the Best Value review, you may have to act politically. Qualitative information is likely to be useful to you in gauging impact, but you may have to supplement this evidence with less useful quantitative data (usually process data) to placate and reassure members. No, we are not saying that qualitative information is always or even usually better than quantitative information – only that it sometimes serves different purposes. Sophisticated quantitative data and good-quality impact information should complement each other strongly.
An example of the value of combining qualitative and quantitative information is drawn from one of the project workshops. When unpacking their objective of ‘Supporting the curriculum by providing materials that pupils can use to widen/enrich learning’ the team identified three questions that they needed to answer in order to judge impact, together with various ways of answering them:
Is the material relevant?
- Evaluation forms
- Critical incident interviews (see B10.6)
- Talking to education officers/advisers

What is the optimum level of use/take-up?
- Statistical PIs

How is the material used to achieve enrichment?
- Asking heads, teachers and pupils.
Both the quantitative and the qualitative information are needed here: not only to enable careful selection of material and to assess impact, but also to show the scale of the service and whether it can be justified in take-up terms.
Part of the monitoring of your own evidence will be to make statistical comparisons with other services (are we measuring the same things? Do these benchmarking club statistics tell us anything?). A helpful touchstone may be to envisage situations in which you need to convince others (a minimal comparison sketch follows these examples):
- when you are talking to the Director of Education about the public library role in supporting literacy, will the sorts of evidence you are assembling be useful to show what you are doing?
- when the first DCMS Inspector (or an Ofsted Inspector) comes to ask about your role in promoting reading, will your evidence show what you are doing?
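Comparisons with other services only tell you something if the figures are put on a common footing first. A minimal sketch of that normalisation step, in Python, using entirely hypothetical service names and figures:

    # Normalising headline statistics before comparing services.
    # All service names and figures below are hypothetical.
    services = {
        # service: (annual issues, resident population)
        "Service A": (1_200_000, 150_000),
        "Service B": (2_400_000, 400_000),
        "Service C": (900_000, 100_000),
    }

    for name, (issues, population) in services.items():
        per_head = issues / population
        print(f"{name}: {per_head:.1f} issues per head")

    # Raw totals would rank Service B first; per-head figures show that
    # Service C generates the most issues per resident. A benchmarking
    # comparison is only meaningful if the denominators are comparable too.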
It is not always necessary to generate your own evidence in order to argue a case or take management decisions. At present there is scant national research evidence to guide public or schools library services in making service judgements, but this is beginning to change. A growing emphasis on ‘evidence-based practice’ is working through into the LIS field and some useful studies are emerging (e.g. a survey of readers returning books in four large libraries that gives information on the reasons why particular non-fiction books are borrowed).
B10.5 Social audit
This approach has been generated by Rebecca Linley and Bob Usherwood16 to help garner qualitative information in a context where “The Audit Commission is only concerned with what is measurable and, as a result, tends to ignore a great deal of what is important”. We all know that issue figures, enquiry counts and visitor numbers do not enable libraries to demonstrate the impact that they can have on communities and individuals. However, while objective quantification may not be possible in areas such as social inclusion and community empowerment, the impact and contribution of library services can and must be shown clearly. The Social Audit is offered as a “tool for enabling sensible measurement of complex outcomes” and a “technique that makes the enacting process (of libraries) visible”. It is a qualitative technique, but one that, the authors argue, is none the less valid and rigorous (important in the Best Value process).
Linley and Usherwood acknowledge that, as with many qualitative techniques, it is time-consuming and hard to formulate. However, if the alternative is not having clear evidence of the contribution of the service to individuals and to the community, the service will be judged on limited information that significantly underplays its value.
The Social Audit process is based on extensive and carefully focused ‘discussions’ with stakeholders – all of whom are to be identified to obtain “as full a view as possible of the social aspects of public library activity”. The ‘discussions’ take place using interviews, focus groups and workshops. Community profiling enables stakeholders to be identified and focus groups to be constructed on a valid basis.
The authors claim that Social Audit obtains “real world data” in a “rigorous way”. It is qualitative data but is promoted as “valid evidence and should be treated as such by politicians and professionals alike” (p.85).
This type of data collection is not only important to Best Value and to justifying services to politicians; it is also vital to good management and service development. Output data do not confirm the degree to which the service is achieving desired results – and you need evidence and good data on performance to feed back into the planning process.
B10.6 How to collect information
Public library services have traditionally relied heavily on service performance (usually output) statistics, such as enrolments and loans, backed by the occasional questionnaire survey. Having viewed hundreds of library service questionnaires, we feel this may be the time to offer some comments:
- many questions are too general and too bland to do more than show that libraries are well-regarded – by users
- some others over-reach themselves by trying to gather complex information that is more appropriate for interviews
- many are poorly designed – with embedded library jargon, sloppy wording and little variety of task
- very few have been adequately piloted
- most show evidence that they were produced in too much of a hurry. Constructing and testing a properly designed questionnaire will take at least one person-week.
Overall, too many questionnaires unwittingly project the designers’ preconceptions out to potential respondents and have them reflected back by nice people who are trying to second-guess what is meant. Questionnaires are a good method of gathering small amounts of specific information through structured sequences of questions – when you already have an idea of the range of replies that is likely and have a feel for the language in which the people you are surveying usually discuss the concepts that interest you.
Focus groups have probably taken over from the questionnaire as the most misused research instrument. Many library services are using focus groups and they can be really useful – if they are carefully structured to obtain the sorts of information you want. The usual intentions are to find out people’s views on a limited set of propositions (such as service objectives) or to pick people’s brains more generally. Neither of these outcomes is readily achieved by a general round-table discussion.
There are literally thousands of tried and tested group activities, including many that are specifically designed to collect and prioritise information. When trying to prioritise and discuss propositions:
- various types of card-sort activity should give you focused information (and they are fun to do). If you want to stimulate discussion about service priorities (or, if you must, specific services): generate your own set of key propositions in relation to your theme; then type and stick each proposition (about 20-24 propositions are ideal) on 5 x 3 cards or ‘Post-its’; divide people into small groups (not more than 4 people); then ask them to sort one set of the cards per group in one of several ways, of which two are described below:
- ask people to organise the cards into a meaningful pattern, shape or flowchart, adding their own ideas on blank sheets if they wish. Then de-brief each group in turn by asking them to explain what they have produced and why, picking up on points of similarity and difference between groups, or
- ask people to reject all but nine of the cards (they can add their own ideas on blank cards at this point) and arrange the chosen nine in a diamond formation: one card at the top, then rows of two and three, then two, with one card at the bottom.
The idea of this ‘nine diamond’ is to force prioritisation without wasting time over the ‘in-between’ rank order. Then debrief across the groups by asking what people placed first and why; what went into the second tier and why; what people threw out easily and why; and what they added in (if anything) and why. (A minimal sketch for tallying nine-diamond results across groups follows the general points below.)
General points:
- allow ample time (25-30 minutes) for sorting activities (including small group discussion)
- allow at least 5-10 minutes per group when de-briefing (even working across groups)
- use a flip chart to catch most of the content and, if possible, get a colleague to take notes of the discussion.
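Where several groups each produce a diamond, the placements can be pooled to see which propositions consistently rise towards the top. A minimal Python sketch of one way to record and tally the results; the tier weights and proposition labels are illustrative assumptions, not part of the technique itself:

    # Recording 'nine diamond' results across groups. Tier weights and
    # proposition labels are illustrative assumptions, not part of the
    # technique itself.
    from collections import defaultdict

    # Diamond rows, top to bottom: 1 card, then 2, 3, 2 and 1.
    TIER_WEIGHTS = {1: 5, 2: 4, 3: 3, 4: 2, 5: 1}  # top tier scores highest

    # Each group's diamond, recorded as tier -> propositions placed there.
    group_results = [
        {1: ["Homework support"],
         2: ["Reader development", "ICT access"],
         3: ["Outreach", "Staff training", "Stock quality"],
         4: ["Opening hours", "Publicity"],
         5: ["Signage"]},
        {1: ["Reader development"],
         2: ["Homework support", "Stock quality"],
         3: ["ICT access", "Opening hours", "Outreach"],
         4: ["Signage", "Staff training"],
         5: ["Publicity"]},
    ]

    totals = defaultdict(int)
    for diamond in group_results:
        for tier, propositions in diamond.items():
            for proposition in propositions:
                totals[proposition] += TIER_WEIGHTS[tier]

    # The highest totals show which propositions groups consistently
    # placed near the top of their diamonds - a starting point, not a
    # substitute, for the de-brief discussion.
    for proposition, score in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{score:2d}  {proposition}")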
If your emphasis is on picking people's brains, then two options are:
- individual brainstorming – invite participants to think about their answers to a carefully chosen question, such as ‘What are the main steps required to provide effective support for reading promotion for [a specific community]?’ Ask them to write their replies on ‘Post-its’ (one idea per sheet) and pool the results in the middle of the table. Then invite the participants to cluster the sheets in whatever way they think works (there will be overlaps – get participants to stick these on top of each other), sorting out what the statements mean as they go along and labelling each cluster themselves (this will tell you how they describe the concept). You can then focus discussion on each of these clusters in turn, using open questions such as ‘What are the issues or concerns here?’
General points:
- focus the discussion on the clusters and listen carefully to each person in turn
- ensure that someone is recording the discussion – you are too busy!
- structured brainstorming (such as the Nominal Group Technique outlined here) – seat participants (ideally about 7-12 people) in horseshoe format round a table, with you and a flip chart at the end, and ask them to record their responses to your research question (as for individual brainstorming) but this time in short phrases on their own note pads. When people stop writing, ask each person in turn to offer one item to the pooled list. Write these in their words on the flip chart, numbering each one. Keep going round until people run out of ideas (usually 30-60 ideas). Be inclusive; if in doubt, write it up on the flip chart. Display all the sheets as you go along.
When all the ideas are in, look at each sheet in turn and invite the group to seek clarification of anything that is too cryptic – not to say whether they agree or disagree with the propositions. Then the group may wish to link a few overlapping ideas. At this stage, invite people to choose the five most important items from the list (important for them individually, or for your service – you choose). When they have all chosen the appropriate numbers, either get them to tick their five choices on the displayed sheets, or record their numbers in priority order (most important first – use a different coloured pen!) by writing from 1 to 5 for each person on the displayed sheets. (You may then like to weight these scores from first choice = 5 points downwards, using another different colour, and add up the total points for each idea; a minimal scoring sketch follows the general points below.) You are now ready to discuss their high-priority items.
General points:
- Actively discourage discussion until after the prioritisation stage.
- Timings: individual recording of ideas 5-10 minutes (this will feel like 5 hours the first time around – but you must give people time to think!); round-up of ideas c.30 minutes; ranking 5 minutes; scoring (ticking or you writing) 5 minutes; totalling the scores (offer a 5-minute break whilst you do this); discussion c.30 minutes.
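The scoring arithmetic described above is simple but easy to muddle at the flip chart. A minimal Python sketch of the tally, assuming hypothetical idea numbers:

    # The Nominal Group Technique scoring step: each participant picks
    # five ideas; first choice = 5 points, fifth choice = 1 point.
    # Idea numbers below are hypothetical.
    from collections import Counter

    # One list per participant: their five chosen idea numbers from the
    # flip chart, most important first.
    choices = [
        [12, 3, 7, 25, 4],
        [3, 12, 9, 7, 1],
        [7, 3, 12, 4, 25],
    ]

    scores = Counter()
    for ranking in choices:
        for position, idea in enumerate(ranking):
            scores[idea] += 5 - position  # position 0 = first choice = 5 points

    # Ideas with the highest totals are the group's priorities for discussion.
    for idea, total in scores.most_common():
        print(f"idea {idea}: {total} points")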
B10.7 Critical incident interviews
A powerful method of finding out what people are looking for is to ask! Focusing on specific enquiries (e.g. reference enquiries or requests for non-fiction) and seeking to put these in context should provide a rich vein of information about how and why people use your service. Questions should cover (a minimal recording sketch follows the list):
- what led the person to ask for that information/resource then
- whether they sought the same information elsewhere
- whether they obtained what they wanted from any source and what it was
- ‘customer care’ questions – how they were dealt with etc.
- whether they did anything with the information
- whether there was any specific outcome (e.g. using information to argue a successful case for funds)
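To make answers comparable across interviews, it helps to record each incident against the same fields. A minimal Python sketch; the field names and example entry are assumptions derived from the question list above, not a prescribed schema:

    # One record per critical-incident interview, so that responses can
    # be collated later. Field names are assumptions based on the
    # question list above, not a prescribed schema.
    from dataclasses import dataclass

    @dataclass
    class CriticalIncident:
        trigger: str        # what led the person to ask for that information then
        elsewhere: str      # whether they sought the same information elsewhere
        obtained: str       # whether/what they obtained, and from which source
        customer_care: str  # how they were dealt with
        use_made: str       # whether they did anything with the information
        outcome: str        # any specific outcome (e.g. a successful case for funds)

    # Example entry (hypothetical):
    incident = CriticalIncident(
        trigger="School project on local history",
        elsewhere="Tried the internet at home first",
        obtained="Two local-studies books and a pamphlet from this library",
        customer_care="Staff member walked me to the shelves",
        use_made="Used in a class presentation",
        outcome="Teacher asked the class to visit the library again",
    )
    print(incident)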
B10.8 Getting help
With the current emphasis on partnership working, we should remember that partners are potentially in a position to collect useful information, and may be doing so for their own purposes. Some points:
- if you want people (such as teachers) to give you feedback about how resources are used by others, tell them what you would like them to look out for in advance, then ask for that information later
- some people (such as inspectors/advisers) are collecting information and writing reports that may be useful if you can tap into them
- some people (school librarians, booksellers? museums staff?) share some goals with your service and may be ready to collaborate over information gathering
- take opportunities to systematically observe aspects of your service and record the results. Be prepared to share this information with others who will observe for you.
- school librarians should be encouraged to do systematic observation for their own (and your) benefit
16. LINLEY, R. and USHERWOOD, R. C. op. cit.