Wednesday, 20 January 2016

Student Views - running the Library Questionnaire

During two weeks at the end of November I ran the first of a new, annual Library Questionnaire with our students. At the moment our students don't really engage with the Library beyond using it as a study space. Few resources are borrowed, very few information enquiries are made and there is little course integration, for example through resource lists or inductions. The purpose of my questionnaire, therefore, was to gauge awareness, gather opinions and collate data that I could use with teaching staff and College management to help drive forward improvements. I also included a couple of questions supplied by COLRIC (Council for Learning Resources in Colleges) so that I can benchmark ourselves against other institutions.

My experience of questionnaire planning, delivery and analysis with the LRC team in my previous post had prepared me well for going it alone. So what had I learnt?
  • Each question must have relevance. It might be that you want to find out something specific you can't get from your usual statistics gathering. It might be that you need evidence to include in a review or proposal. Or you might want to use it to help raise awareness of a resource or service, which then provides year-on-year data demonstrating your promotional success (or lack of it!). Whatever the reason, if you cannot specify exactly how you will use the results then it is not a relevant or useful question.
  • Don't ask about something you can't change. If you know that there is no budget or space for new computers in the immediate future, for example, don't ask students whether they think there are enough. Instead, ask whether they can always access one; then you can look at introducing or changing booking allowances. There is little point asking about something you can do nothing about.
  • Encourage negative comments. It's great to know if you're doing things well but in order to make effective improvements you need to find out what your students aren't happy with. When applying an agreement scale to a question, for example, start with Strongly Disagree. That way students have to think about whether there is anything they're not happy about before they reach the options to agree.
  • Encourage details and give provision for follow-up. If you ask whether students want new resources, it's not very helpful to get an anonymous response asking for "more resources for biology" (particularly if you have biology courses at several different levels). Unfortunately, I was unable to go through my questionnaire with students as they completed it, encouraging further details, and so left myself open to ambiguous responses like this. It is therefore important to ask for more information about the students themselves (aside from any institutional equal opportunities monitoring), e.g. which course they are studying and what year they are in. This then allows you to follow up with subject tutors to find out about topics and resource requirements in that area.
  • Analyse your respondent demographic. As well as identifying trends across the questions as a whole, it is also interesting to analyse your responses at a deeper level, looking at who your respondents are and how different groups answered. For example, are there response groupings, i.e. several responses of the same type coming from the same year group or course? This can help you identify priorities in resource provision or promotion even when the need was not explicitly stated elsewhere in the questionnaire.
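The grouping analysis described above can be sketched in a few lines. This is only an illustration: the courses, years and requests below are made up, not real questionnaire data.

```python
from collections import Counter

# Hypothetical questionnaire responses: (course, year, requested resource)
responses = [
    ("Biology", 2, "more e-books"),
    ("Biology", 2, "more e-books"),
    ("Biology", 1, "longer opening hours"),
    ("History", 1, "more e-books"),
]

# Count identical requests within each (course, year) cohort
groupings = Counter(responses)

# Surface any request made more than once by the same cohort
for (course, year, request), count in groupings.items():
    if count > 1:
        print(f"{course} year {year}: '{request}' requested {count} times")
```

The same idea works in a spreadsheet with a pivot table; the point is simply to count identical requests per course-and-year group rather than across all respondents at once.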
I have completed my first phase of analysis, identified the 'headline' results and drawn up an action plan. Despite planning well I did encounter some issues.

I had to leave students to their own devices to complete the questionnaire. The Library is a silent study environment, so I couldn't go through the questions with students there. There are two other study areas in the College and I regularly replenished questionnaires in them to increase responses, but as a team of one I was unable to spend much time in those areas talking to students. The benefit of leaving students alone is that they feel they can be more honest; the drawback is that in future years I won't be able to guarantee a certain number of responses. Furthermore, I found that not all questionnaires were completed the whole way through and some respondents didn't take it very seriously. (I feel confident that the requests to change the Library into a spaceship, install a wood burner and swimming pool, and introduce a trolley service can safely be ignored!)

To encourage responses I also created an online version using Survey Monkey, and this raised issues of its own. Firstly, because of the limits on the number of questions in the free part of Survey Monkey, some questions had to be arranged slightly differently, which may have affected the responses. More worryingly, despite setting most questions as requiring an answer, there were still several instances of questions being skipped. In spite of these issues I received 208 questionnaires back (121 paper and 87 online) - almost half of our student population - and so had enough full responses to make the results meaningful.
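Separating fully completed questionnaires from ones with skipped questions is a simple filter. A minimal sketch, using invented response records where a skipped question is recorded as None:

```python
# Hypothetical response records: None means the question was skipped
surveys = [
    {"q1": "Agree", "q2": "Disagree", "q3": "Neutral"},
    {"q1": "Agree", "q2": None, "q3": "Agree"},  # q2 skipped
    {"q1": "Strongly Disagree", "q2": "Agree", "q3": "Agree"},
]

# Keep only fully completed questionnaires for the headline analysis
complete = [s for s in surveys if all(a is not None for a in s.values())]
print(f"{len(complete)} of {len(surveys)} responses were fully completed")
```

Partially completed questionnaires need not be discarded entirely; they can still feed into per-question totals even if they are excluded from whole-survey comparisons.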

Once you have developed an action plan, the final stage is to let your students know what you are doing. You asked them for their opinions, so it is only right that you tell them what their contributions have resulted in. Inspired by my previous College's method, I put together a 'You Said, We Did' announcement, posted it on our website and social media, and emailed a copy to all students. To encourage engagement it is important that your users know their input is valued.

For next year's questionnaire I will:
  • Sort out the issues with Survey Monkey.
  • Try to reduce the number of questions. (This year I am implementing several new services which I wanted to gauge interest in; the number of questions - 16 - may also explain why some students failed to complete the questionnaire.)
  • Possibly spend some time with students in the other study areas to go through the questionnaire with them and develop conversations.
