National Learner Satisfaction Survey: Guidance on the core methodology and core questionnaire

April 2005

Of interest to everyone involved in conducting learner satisfaction surveys across the post-16 education and training sector




Contents

INTRODUCTION
1.1 What is the National Learner Satisfaction Survey (NLSS)?
1.2 Why undertake a local level survey?

NATIONAL LEARNER SATISFACTION SURVEY METHODOLOGY
1.3 NLSS Interviewing Method
1.4 Number of Learners Surveyed in NLSS
1.5 NLSS Questionnaire

INCORPORATING NLSS METHODOLOGY IN A LOCAL LEVEL SURVEY
1.6 Good Practice Guidelines
1.7 Interviewing Method
1.8 Sampling
1.9 Survey timing
1.10 Questionnaire Design
1.11 Survey administration
1.12 Data analysis and coding

ANNEX A (I): THE QUESTIONNAIRE (TELEPHONE BASED)
ANNEX A (II): THE QUESTIONNAIRE (PAPER BASED)
ANNEX B: DATA ACCURACY LEVELS
ANNEX C: SUGGESTED CROSS TABULATIONS


INTRODUCTION

This document has been prepared by NOP and the LSC and incorporates work from a report by GHK. The Learning and Skills Council recommends, as part of its new success measures, that local LSCs work with providers to encourage consistency in learner satisfaction surveys, thus allowing comparisons to be made with the National Learner Satisfaction Survey (NLSS). It is recommended that providers undertake learner surveys which monitor quality and help plan resource allocation effectively. The approach suggested here is intended to assist colleges and other providers in the post-16 education sector in carrying out locally based learner satisfaction surveys which will:

• provide valid results at the local level; and

• allow comparisons with the National Learner Satisfaction Survey (NLSS).

Recommendations in this report are made on the basis that the methodology of the National Learner Satisfaction Survey (NLSS) should be adopted in the approaches taken by local LSCs and providers.

1.1 What is the National Learner Satisfaction Survey (NLSS)?

In the academic year 2001/02, the LSC set out to obtain national measures of learner satisfaction with the post-16 provision which it funds. In January 2002, it commissioned NOP World to carry out a survey of learners involving three waves of interviews per academic year. The survey was initially designed to cover the following learner types:

• those in the further education sector, that is, those attending general further education colleges, sixth form colleges and other specialist further education institutions;

• those undertaking work based learning; and

• those undertaking adult and community learning (both accredited and non-accredited).

The option of including school sixth forms in the scope of the survey remains under discussion, as does gathering data about further education provision made by higher education institutions. A national picture is valuable in that it provides a comprehensive overview of education and training in England for people aged 16 and over. It also provides a


facility for tracking the way in which learner satisfaction changes over time, and for looking in detail at the experiences of certain sub-groups within the learner base, including younger learners and learners from ethnic minority backgrounds. After comprehensive reporting in the first year, and positive reaction from stakeholders and providers on the usefulness of the data, the survey continues to be undertaken so that year-by-year comparisons can be made.

1.2 Why undertake a local level survey?

One of the first questions to ask when considering a local or college/provider survey is what extra benefit it will deliver over and above:

• the NLSS results; and

• existing learner satisfaction research that may be undertaken.

First: while the NLSS will highlight broad issues and shed light on particular learner groups, it cannot reflect the specific challenges and priorities of local LSCs or of individual colleges or providers across all provision. It can therefore be difficult to isolate the findings that are most relevant at local or college/provider level, and to take ownership of the results. It follows, too, that local data are essential for identifying priorities for action within a particular area or institution. Note that only when research data have been translated into an agreed action plan, i.e. a tool for mobilising and motivating activity, can they really be regarded as effective in helping to improve quality for learners.

Second: framing local research to match the national research will give local colleges and providers an opportunity to benchmark their results against the national average, i.e. it provides a context against which to view and judge the local picture. Such an approach will also encourage standardisation of satisfaction survey instruments within an area, and allow interested parties to take full advantage of the development work that has gone into the NLSS.

NATIONAL LEARNER SATISFACTION SURVEY METHODOLOGY

1.3 NLSS Interviewing Method

A telephone interview has always been the favoured approach for the NLSS. The reasons for this are:

• response rates: good response rates are usually achieved;

• facilitated responses: during interviews there are opportunities to probe some open responses more fully and obtain better quality answers, which may be crucial to understanding the issues lying behind overall satisfaction ratings; and

• cost effectiveness and efficiency compared with other approaches.


The interviewer asks for the selected respondent in person and does not accept proxy interviews with third parties. This ensures first-hand access to learners, and direct responses to survey questions. The only situations where a third party may be involved are where a face-to-face interview is needed and a carer is present for support. This is built into the sample design for the NLSS: we wished to include learners with disabilities and learning difficulties, and we offer face-to-face interviews if respondents or carers believe this is required.

Interviews for the 2001/02 and 2002/03 surveys lasted around 20 minutes. In 2003/04, the approach was significantly changed (see later) and interview length was reduced to 10 minutes. Interviewers are trained to the Market Research Society (MRS) Code of Conduct, and quality systems for monitoring exceed the industry standard requirement.

1.4 Number of Learners Surveyed in NLSS

The types of learners interviewed as part of the NLSS have remained unchanged over the last three years, the only notable difference being that in the first year a pilot survey of non-accredited learners was carried out; these data were not published owing to the relatively low number of interviews and concerns about the quality of information obtainable from a small sample. In 2002/03 and 2003/04, the sample generation method was refined, and the survey included these learners in the reporting. The table below shows how many interviews were conducted with the various learner types* over the previous three academic years.

Learner type                     2001/02   2002/03   2003/04
Further education (total)         10,000    19,947    31,786
  of which: general FE             5,706    17,369    27,629
  sixth form college               3,363     1,935     3,380
  other                              753       643       777
Work Based Learning                2,032     2,003     6,111
Accredited ACL                       723     1,203     1,652
Non-accredited ACL                   278     1,958     3,767

(*As noted earlier, the intention with the NLSS has always been to include learning provision at school sixth forms when possible; so far, this has proven infeasible. The possibility of covering higher education institutions which deliver further education qualifications, and specialist residential institutions, has been discussed and remains under consideration, but no action has been taken to date.)

1.5 NLSS Questionnaire

For the 2003/04 survey, the questionnaire underwent significant re-structuring to reduce interview length from around 20 to 10 minutes, and thus allow for far more


interviews. The increased sample size should also enable more robust data analysis for further education at local LSC level.

The NLSS now runs in three waves: a core set of questions is asked in every wave (overall satisfaction, satisfaction with teaching/training, and returning to learning), and a particular set of modular questions appears in each wave. At the design stage of the questionnaire, the LSC decides the areas/issues to be covered and questions are designed to address these. The final form of the questionnaire is evaluated and approved by an external advisory committee established to guide the progress of the NLSS. The full questionnaire for all three waves is shown in Annex A and is summarised as follows.

Core questions (asked every wave)

Quality of teaching/training and lesson/session management:

• Now moving on to teaching/training. Overall, how satisfied are you with the quality of the teaching/training at COLLEGE/PROVIDER/WORKPLACE? Would you say you are... (seven-point satisfaction scale provided)

• How would you rate the teachers, tutors or trainers on the following aspects of teaching/training? (list of measures given) Please score on a scale of 1 to 10, where 1 represents very poor and 10 excellent. IF NEEDED – WE RECOGNISE THAT RATINGS MAY VARY FOR INDIVIDUAL TEACHERS/TRAINERS

• Would you say that all of your lessons or training sessions are put to good use, or are some wasted?

• Which of the following situations have you encountered on a fairly regular basis? (list of situations relating to lesson/session management provided)

• In general, how do you feel about the feedback on how you are doing from your teachers/tutors? (scale of motivation)

Overall satisfaction:

• And now taking all the issues we have discussed into account, how satisfied are you with your current learning experience at COLLEGE/PROVIDER/WORKPLACE? (seven-point satisfaction scale provided)

• You said you were ...(scale of satisfaction)... what is the main reason for this?

Returning to learn:

• How likely will you be to undertake further learning in the future (say, in the next 3 years)?

Modular questions for wave 6: pre-entry questions (subject to review/change in 2005)


Influences on choice of course/provider:

1. Which of the following influenced your choice of course? (learners given list of options)

2. IF 16/17/18 Apart from COLLEGE/WORKPLACE/PROVIDER when you were deciding where to study or train, which OTHER college or provider did you consider? READ OUT LIST. EMPHASISE ‘OTHER’ COLLEGE/PROVIDER

3. And was COLLEGE/WORKPLACE/PROVIDER your first choice, i.e. the place where you most wanted to do your course?

4. And was the course you are currently doing your first choice, i.e. the subject/qualification that you most wanted to do?

5. What are your main reasons for deciding to attend COLLEGE/WORKPLACE/PROVIDER for your course?

Access and usefulness of advice and information:

6. Did you obtain advice about your course or college or provider from any of the following? (learners given list of options)

7. How useful was the advice you received from?

Modular questions for wave 7: learner support questions (subject to review/change in 2005)


Administration of the learning:

1. Thinking about the site where you do most of your course or training, and of health and safety specifically, which of the following did COLLEGE/WORKPLACE/PROVIDER inform you about? (learners given list of options)

2. How well do you think that the following issues were managed? …(learners are asked to rate the quality of administrative issues such as timetabling and communications)

Problems and complaints:

3. Since you started the course/training have you had any problems with any of the following? (learners given a list of options)

4. Have you sought advice or help from the COLLEGE/WORKPLACE/PROVIDER on any of these matters?

5. IF SO, generally how useful was this?

6. Have YOU ever made a complaint to the COLLEGE/WORKPLACE/PROVIDER about your course/training or other experiences?

7. IF SO, what was your complaint about?

8. Outcome of complaint?

Modular questions for wave 8: Impact questions (subject to review/change in 2005)


Attitudes and impacts:

• Feeling about school when left (positive/negative/no feelings either way)

• I am now going to read out a number of statements that describe the way some people feel about learning, and would like you to choose the ones that apply to you and the way that you feel now.

• I am going to read out a few statements about what effect the course may have had on you personally. Could you tell me whether you agree or disagree with each?

INCORPORATING NLSS METHODOLOGY IN A LOCAL LEVEL SURVEY

1.6 Good Practice Guidelines

For interested stakeholders to derive optimum value from a local survey, we believe it is important that local surveys are broadly comparable with the approaches adopted for national application. This means that if you wish to model local surveys on the NLSS, and to compare issues against national findings, you need to ensure that you maintain broad comparability. Before reviewing methods of interviewing, sampling and data collection, it is important to set out the areas of good practice that need to be covered.

Good practice check list:

• Interviewing method
• Sampling
• Administration of the survey
• Survey timing
• Questionnaire
• Data analysis and coding

Crucially, we need to recognise that every method has its own limitations and necessary caveats, but the pursuit of reliable data should be at the heart of every exercise. There are guidelines to follow in this pursuit.


1.7 Interviewing Method

Telephone interviewing was chosen as the option for the national survey based on:

• speed;

• control over sampling and over the response process; and

• the likelihood of achieving a better response rate than self-completion.

It would be ideal if providers and local LSCs seeking to benchmark results against national norms could replicate the sampling and interviewing methods used for the NLSS. (See the later discussion of how other methods can be used to deliver broadly comparable results.) Local LSCs and providers will need to take a view on whether they can sustain the approach adopted by the NLSS, by obtaining firm quotes from reputable agencies or organisations, e.g. research agencies or alternative call centre operations (see also the later discussion of the need for these organisations to adhere to the MRS Code of Conduct). Providers may instead carry out their own surveys if they believe this presents a viable alternative to out-sourcing the work. Other options for interview approaches include:

• face-to-face interviews (not considered a viable option for the national survey on the grounds of cost and clustering);

• on-line interviews (not considered possible because many of the learner types in the sample lack access to information technology); and

• paper based questionnaires (self-completion questionnaires attract a lower response rate than telephone surveys).

We now examine the strengths and weaknesses of the research methods noted above. Much of the following information is taken from the GHK report Seeking and Utilising Learner Feedback, submitted to the LSC in 2004 as part of its Measures for Success programme. The report is based on interviews with further education providers and aimed to explore the amount of activity, and the strengths and weaknesses, of various approaches to gathering learner feedback. GHK undertook interviews with 29 providers and explored the use of surveys, issues relating to administering surveys, and the dissemination and use of results. The strengths and weaknesses identified by providers for each research method are summarised in the following table:


The GHK report highlights other methods that respondents used for collecting learner feedback.

• One-to-one interviews: good for gaining qualitative input but the cost of resourcing means that generally few interviews can be conducted. This therefore raises concerns about obtaining a representative sample from the results.

• Student representatives: a good and valued input, but concerns turn on inclusiveness, i.e. there is a need to gain feedback from all learners, not simply those who attend student meetings and are confident enough to voice opinions in this type of forum.

• Independent research agency: generally positive feedback about this method (independence, good response rates, low administrative burden for the organisation, and scope for benchmarking), but the cost for some providers is too high.

Method: Telephone
Strengths: response rates are good; can provide in-depth information by gathering data from open and closed questions (can gather qualitative feedback).
Weaknesses: cost. While organisations appreciate the benefits in terms of sample control and response rates, some providers consider the costs too high for them to adopt this method.

Method: Paper based surveys
Strengths: cost effective; easy to input information; easy to provide feedback.
Weaknesses: response rates. Different organisations quote very different experiences, but in many cases the response rate is not as high as the organisation would have liked.

Method: Online
Strengths: quick to complete; simple; automatic analysis; instantaneous results; students enjoy doing it and can view the results immediately.
Weaknesses: learners thought this was a good idea and the 'form filling' was seen as more enjoyable, but despite this response rates have been low. There is also the issue of some learners needing access to IT in order to complete the survey.


The GHK report also provides a series of short case studies illustrating useful provider practice; there are already many examples of good practice in operation, some of which are given on the following page. The GHK work provides some evidence that providers appreciate, and act on, the value of learner feedback. However, decisions about pursuing feedback are often weighed against factors such as time and cost. The findings are summarised below.

Many providers cite more than one method of gathering feedback, and it must be stressed that if these methods are robust, this should be seen as positive. There is no single best method, and both practical and cost considerations need to be taken into account. It is worth noting, though, that a balanced approach which deploys both quantitative and qualitative techniques to obtain learner feedback will usually be regarded as, and prove to be, more advantageous than either approach on its own.

This report recommends that, where a quantitative survey is planned, it should ideally generate results which allow comparison with national data. It is not suggested that you abandon surveys and feedback mechanisms you are currently committed to, but that you ensure, as far as possible, that you incorporate the NLSS core questions into your methodologies.

Practical considerations:

• Cost
• Administrative burden
• Response rates
• Length of survey being prohibitively long
• Need to include learners with special needs
• A need for qualitative data (open ended questions)

Need for feedback:

• Having a measure of quality
• Comparing quality with national benchmarks
• Troubleshooting
• Better planned resources
• Retention rates/returning to learning


Examples of good practice in local surveys:

This case study highlights a paper based questionnaire approach:

Further education provider: paper-based survey. A provider which administers a paper-based survey to its learners reported response rates of up to 80 per cent. Considerable effort was spent initially explaining to learners the benefits of participating in the feedback system before administering the survey, and then quickly responding to the issues raised. This demonstrates to learners that their views are listened to and that action is taken on their feedback. Surveys were also labelled so that quality management staff could contact non-respondents. More sensitive issues were surveyed anonymously using sealed envelopes and drop boxes.

This case study highlights how a provider seeks constant updates on performance:

Work-based learning (WBL) provider: questionnaires. A WBL provider administers questionnaires every 6 months to identify learners' views on induction, learning, work experience and whether expectations have been met. The advantages of this method are the ease with which data can be analysed and collated. Questionnaires are supplemented with 6-weekly reviews with trainers and assessors. Using the two methods in tandem ensures that regular contact is maintained with learners, so that problems and issues can be identified and addressed efficiently and promptly.

This case study highlights the value of qualitative input:

16-18 provider: one-to-one reviews. A 16-18 provider uses a range of methods to collect information on learner satisfaction, the majority of its learners being dyslexic. One-to-one reviews are conducted with both learners and employers as part of the contractual obligation with the LSC. Questionnaires are also administered to students quarterly to gather feedback. For example, the first set of questionnaires consists of skills assessments to determine students' abilities and identify learning difficulties. In addition, focus groups are conducted with groups of young people at the end of the course to identify good practice and to let learners suggest areas for improvement.


1.8 Sampling

Optimum sample design is usually that which delivers an equal probability of selection for each individual in the chosen sampling frame, i.e. there is nothing in the construction of the sample which will select some individuals more frequently than others, unless this is the specific intention of the methodology (see the discussion below). There are usually two routes to consider when first thinking about sample design.

Census survey

The first is to conduct a census survey, in which you talk to ALL of the people in the population you are surveying. Strictly, this is not a sample at all but the population as a whole, and it generally avoids problems which may arise from the typicality of samples chosen by other methods. For providers it might mean including all learners on a database at any time in a given year. Census surveys are feasible for providers if they are confident that they have an up-to-date list of ALL learners who have been on courses or programmes throughout the year, and if numbers are low enough for the exercise to remain manageable. Where providers are carrying out a self-completion survey, the additional cost of a census may not be substantial, and may be outweighed by the benefit of having involved all learners in the exercise. Costs only escalate if, for example, there are a large number of open-ended questions which need to be coded.

Sample survey

The second option is to conduct a sample survey, in which you survey only a given number of the whole population (e.g. 1,000 learners from a population of several thousand). This is the most common approach to sampling, and the one which we envisage most local LSCs and providers will usually wish or need to take. When deciding on the scope of the survey, providers and local LSCs need to explore exactly who they wish to survey.
For the national survey, it has been important to cover all programme areas and demographic groups (age, gender, disability, learning difficulty and ethnicity). While providers should adopt the principles and practices on sampling learners discussed below, they will need to determine whether there are any groups of learners (e.g. minority ethnic groups) for whom a representative sample will not normally yield enough interviews and data for separate analysis. Where this is the case, you may need to boost the numbers of individuals within these groups to ensure there are enough for effective analysis (this is explained later in this section). Completing a sample survey means having up-to-date and comprehensive lists of learners, and checking that nothing about the list or the selection method will bias the sample towards or against certain groups. As noted above, within any research project, optimum sample design is one that delivers an equal probability of selection for each individual in the chosen sampling frame.
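Annex B sets out data accuracy levels in full; as a quick check on whether a sub-group will yield enough interviews for separate analysis, the approximate 95 per cent margin of error for a percentage can be computed from the achieved sample size. The following Python sketch is illustrative only, and is not part of the NLSS methodology:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error (as a proportion) for an observed
    proportion p from a simple random sample of n interviews."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5): precision of a satisfaction percentage by sample size.
for n in (50, 100, 400, 1000):
    print(f"n={n:4d}: +/- {margin_of_error(0.5, n) * 100:.1f} percentage points")
```

For example, a sub-group of only 50 interviews carries a margin of roughly plus or minus 14 percentage points, usually too wide for separate reporting, which is why boosting may be needed.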


Providers wishing to incorporate the NLSS methodology into their learner satisfaction surveys should be aware of the need to choose a sampling frame which includes all learners in the group they wish to survey. To achieve this, providers will need one of the following as a pre-requisite:

• a comprehensive and up-to-date sampling frame, in other words a complete list of learners including early leavers; or

• another means of tracking individuals which will not result in some individuals having a greater chance of being identified than others (for instance, sampling on site).

For example, sampling all learners on site on a particular day or occasion is likely to under-represent part-time students. For this reason we recommend that providers use learner records, provided these databases are up to date, so that a sufficient number of learners are captured by the survey. We recommend this approach for both paper questionnaires and telephone surveys. Moreover, it is important to ensure that early leavers are included within the definition of 'all learners', not least because providers may wish to capture information about the reasons for early leaving, among other things. To reach this group, it will probably be necessary to send questionnaires to their home addresses.

How to create a sample from a database of all learners

The NLSS uses the following procedures, and we recommend that local surveys match these as far as possible. If a sample survey is being carried out, it is important to ensure that:

• lists used as the sampling frame are complete and do not exclude particular groups of students (e.g. early leavers); and

• they are not structured in a way that may cause bias, or even the perception of bias, unless this is deliberately sought (e.g. over-representation of sub-groups).
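As an illustration of the first point, a complete sampling frame can be assembled by merging current-learner and early-leaver records and removing duplicates before any selection is made. This is a minimal Python sketch; the field names (learner_id, status) are hypothetical stand-ins for whatever your learner database actually records:

```python
def build_sampling_frame(current_learners, early_leavers):
    """Merge current and early-leaver records into a single frame,
    keeping each learner exactly once (keyed on learner_id)."""
    frame = {}
    for record in list(current_learners) + list(early_leavers):
        frame.setdefault(record["learner_id"], record)
    return list(frame.values())

current = [{"learner_id": 1, "status": "current"},
           {"learner_id": 2, "status": "current"}]
leavers = [{"learner_id": 2, "status": "early leaver"},  # already listed above
           {"learner_id": 3, "status": "early leaver"}]
frame = build_sampling_frame(current, leavers)
print(len(frame))  # 3 distinct learners
```

De-duplicating before selection matters because a learner who appears twice would otherwise have twice the chance of being sampled, breaking the equal-probability principle described above.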

In designing a sample, providers can take a random 1 in 'n' approach, using, for example, an alphabetical list of learners. Here 'n' is the sampling interval: the total number of learners on the list divided by the number required in the sample. For example, to draw a sample of 500 from a list of 5,000 learners, every 10th learner is selected, starting from a randomly chosen point within the first interval.

Stratified sampling

The method used in the NLSS, and therefore the method recommended for use when conducting local surveys, is a stratified sampling approach. This


basically means organising the list of learners into groups according to key variables, which are used to ensure adequate representation in the sample. Stratification usually improves reliability, principally by guarding against unusual occurrences. To give a simple example: it is possible, although wildly improbable, to draw a simple random sample that consists only of men, or only of people aged 65 or over. Though the chances of this happening are very small indeed, an un-stratified sample can be slightly unrepresentative without going to the extreme of being all male or all over a certain age. By dividing the sample into two separate groups of males and females, and then dividing those two groups into three age groups, we can ensure that the sample we select contains the required number of men aged under 25, women aged 16 - 18, and so on.

To summarise, the NLSS methodology for creating a learner sample from a learner database incorporates the following procedure. Prior to sampling, the database should be stratified by the following variables (these are all present on the ILR):

• college;

• gender;

• ethnicity;

• age;

• student mode of attendance; and

• widening participation factor.

In the NLSS, age was split into the following ages:

• 16 - 18;

• 19 - 24; and

• 25+ or not specified.

Sampling was done on a 1 in ‘n’ basis with a computer-generated random start.
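As an illustration only (not part of the official guidance), the stratify-then-select procedure above can be sketched in a few lines of Python; the field names are invented for the example rather than taken from the ILR specification:

```python
import random

def nlss_style_sample(learners, sample_size, strata):
    """Sort (stratify) the learner records by the chosen variables,
    then take every nth record from a random start (1 in 'n' sampling)."""
    ordered = sorted(learners, key=lambda rec: tuple(rec[v] for v in strata))
    n = max(1, len(ordered) // sample_size)  # the 'n' in 1 in 'n'
    start = random.randrange(n)              # computer-generated random start
    return ordered[start::n]

# e.g. a database of 5,000 records stratified by gender then age band,
# sampled down to roughly 500 learners:
# sample = nlss_style_sample(db, 500, ["gender", "age_band"])
```

Sorting by the stratification variables before taking every nth record guarantees that each stratum is represented in proportion to its size, which is the point of the stratified design described above.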


Sample size and complexity

In deciding on sample size the key issues to consider are:

• what level of accuracy you require from the data; and

• the extent to which you want to look at different sub-groups of learners. In order to work out the sample size you need, the following ready reckoner table shown below may be helpful. This calculates the reliability for findings generated by different sample sizes. As might be expected, the table shows that the larger the sample size, the more accurate the data finding is i.e. the data is more accurate for a sample size of 600 learners than 400 learners. The NLSS and many other surveys use a confidence level of 95 per cent which means there is only a 1 in 20 (i.e. 5 per cent) chance that the findings are not a true representation of opinion. The formula used in the table takes into account, when conducting a local survey, the total learner population is limited. This is referred to in the table as universe size. Looking at the grid, if a local LSC has 3,000 work-based learners and conducts interviews with a random, un-clustered sample of 400, the figures resulting from the survey will have a maximum error of plus or minus 4.6 per cent. This is based on a 95 per cent confidence level and a finding of 50 per cent. Therefore, for a survey finding of 50 per cent think x there is a 95 per cent probability that the true proportion in the survey population, after rounding, is between 45 and 55 per cent.

95% confidence intervals for a 50% finding and limited universe size

Universe size   +/-% maximum error
                Sample 400   Sample 500   Sample 600
1,000           3.8          3.1          2.5
1,200           4.0          3.3          2.8
1,400           4.1          3.5          3.0
1,600           4.2          3.6          3.2
1,800           4.3          3.7          3.3
2,000           4.4          3.8          3.3
3,000           4.6          4.0          3.6
6,000           4.7          4.2          3.8

Note: this assumes responses are from a completely un-clustered random sample. For completeness, Annex B tables the accuracy levels at a 95 per cent confidence interval for different finding levels and different sample sizes without a limited universe size.
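The figures in the grid can be reproduced with the standard margin-of-error formula plus a finite population correction. This sketch is our own illustration rather than part of the guidance; it assumes a 95 per cent confidence level (z = 1.96) and a 50 per cent finding:

```python
import math

def max_error(sample_size, universe_size, finding=0.5, z=1.96):
    """Maximum error in percentage points at the given confidence level,
    corrected for a limited universe (finite population) size."""
    standard_error = math.sqrt(finding * (1 - finding) / sample_size)
    fpc = math.sqrt((universe_size - sample_size) / (universe_size - 1))
    return 100 * z * standard_error * fpc

# A sample of 400 from a universe of 3,000 gives roughly +/-4.6 per cent,
# matching the worked example in the text.
```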


We now consider sub-groups that providers might wish to analyse separately. In order to incorporate NLSS methodology, the essential sub-groups to include in the sample design are (these may change according to learner type):

Gender: Male; Female
Age: 16 - 18; 19 - 24; 25+
Age within gender: Male 16 - 18; Male 19+; Female 16 - 18; Female 19+
Mode of attendance: Full-time; Part-time
Highest qualification on entry: Level 3 or higher; Below level 3
Broad course type: Vocational; Academic
Course level: Level 3; Level 2 or below

Some providers may wish to analyse results by another sub-group, e.g. by ethnicity. To achieve the required level of accuracy, this will only be possible where there is a relatively high proportion (usually 10 per cent or more) of minority ethnic learners. Such an approach will not allow detailed analysis by specified minority ethnic group, but will, for example, allow a comparison of the views of white students against those of their minority ethnic counterparts as a whole. Colleges or providers with lower proportions (i.e. under 10 per cent) that wish to analyse by ethnicity will therefore need to boost the numbers of minority ethnic students in their samples. The same may well be true of students with disabilities and/or learning difficulties. Taking into account the essential sub-group criteria to be incorporated into the sample design, we recommend the following:

• a minimum achieved sample size of 1,000 for a general further education college; and

• a minimum sample size of 500 work-based learners per local LSC.

The sample size of 500 for work-based learners per local LSC will not allow separate analysis of early leavers, ethnic minority learners or learners with disabilities. If these


groups are a priority, we recommend boosting their coverage at the sampling stage. This advice also applies to accredited provision in external institutions. In this case, however, it is difficult to recommend a minimum achieved sample size. The minimum is likely to be lower for this provision than for further education colleges and work-based learning, given that the number of learners in each institution is lower (on average 1,500). Working on an average of 1,500 learners, a sample of 500 would yield a maximum error of 3.7 per cent in the results obtained.
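As a rough illustration of the boosting arithmetic (our own sketch, not a formula prescribed by the guidance), the over-sampling factor needed for a small group to reach a target number of respondents can be estimated as:

```python
def boost_factor(group_share, total_sample, target_group_n):
    """How many times over a group must be over-sampled so that roughly
    target_group_n of its members appear in a sample of total_sample,
    given the group's share of the learner population."""
    expected = group_share * total_sample  # expected count without boosting
    return max(1.0, target_group_n / expected)

# If minority ethnic learners are 5% of the population, a sample of 500
# would include about 25 of them; reaching 100 needs a four-fold boost.
```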

1.9 Survey timing

This section looks at the timing of the survey and applies equally to all interview methods. While we recommend that providers and local LSCs administer surveys at the same time of year as the national survey, we are conscious that this may (a) place extra pressure on survey organisations that may be involved in both national and local exercises; and (b) involve learners in more than one survey covering largely the same ground. Both situations appear to be slim possibilities, but providers should be conscious of potential duplication.

As noted above, one factor that may affect comparability between the NLSS and local surveys is timing. The NLSS takes place three times a year. The time-scales for each wave of the 2003/04 survey were:

Wave 1: February 2004
Wave 2: April 2004
Wave 3: June 2004

If surveys involve self-completion and largely involve distribution of questionnaires on college/provider premises, then the survey will generally need to take place during term time. There appear to be two realistic approaches to the timing of college surveys. Option 1 emulates the NLSS and uses two or three waves in any one year. If this choice is taken but two waves rather than three are preferred, we recommend that these take place in February and June. Option 2 is to carry out a single survey if the approach above appears likely to prove burdensome, in which case we suggest that fieldwork takes place any time between March and May, as this should provide data comparable with the bulk of interviews carried out for the NLSS.


1.10 Questionnaire Design

When incorporating the NLSS questions into a local survey it is important to follow guidelines on questionnaire design. Here we look at design issues for telephone interviews and paper-based exercises.

Telephone survey questionnaires

Wherever possible, questionnaires should be programmed into CATI (computer-assisted telephone interviewing), a system whereby interviewers see the questionnaire on a computer screen and the next applicable question is routed for them on the basis of learners’ responses. When using an agency or other organisation to undertake telephone interviews, in order for the data to be comparable with national data, it is important to be clear about the following:

• whether or not the interviewer reads out the pre-coded response options to the respondent;

• whether or not they probe fully on the open questions or simply accept the first response; and

• whether or not pre-coded options are rotated for respondents.

(There are instructions after each question in the version in Appendix A which give you the above details.) When designing new questions (other than those taken directly from the national survey) it is important to test them in advance with actual learner samples. You will need to run at least a pilot so that you can be assured that questions are interpreted correctly and do not have ambiguous or multiple meanings.

Paper based questionnaires

There are various design issues that will have an impact on the quality of results for a paper-based exercise. These are as follows:

• questionnaires need to be reasonably short so that completion is not too daunting a task;

• questionnaires need an introduction - keep it short and clear, make the confidentiality of responses explicit, and include instructions on how to return the questionnaire; and

• if you have the resources to add colour and images then do so (but don’t make the whole effect too busy).



• questions must be laid out on the page so that the respondent can clearly see the order of the questions (it is tempting to try to fit too many questions onto a page);

• if a question allows respondents to write in an answer, ensure that they have enough space on the page to do so - in the pursuit of making the questionnaire fit onto a given number of pages, it is easy to restrict the open answers respondents may give;

• instructions on how to answer each question must be clear and placed in the question text (for example, you need to tell respondents whether to tick all that apply or one only); and

• routing (directing each respondent to the questions relevant to them) is usually the downfall of a paper-based approach - avoid complicated routing and, where it is necessary, make clear where each respondent should go next.

1.11 Survey administration

This section looks at survey administration and covers issues that relate to telephone interviews and to paper-based questionnaires.

Telephone interviews

If providers wish to use an external or third party for this, they should use an experienced agency or other organisation (e.g. a call centre) whose interviewers/operators follow the Market Research Society (MRS) Code of Conduct. There are best practice guidelines for telephone interviewing, and these must be adhered to by every interviewer/operator to ensure that interviewer bias does not affect results (www.marketresearch.org.uk). Most research agencies will also provide intensive training courses for new interviewers to ensure they meet MRS guidelines. Because the quality of interviews is a critical priority for providers investing in telephone surveys, this demands particular attention from those commissioning such research.

Providers and others may also want to consider whether they hold age details and other demographics on relevant databases (and whether these details are up to date for all learners). If so, you can usually avoid asking learners about these matters during questionnaire administration. If learner responses can be linked via a unique identifier to the learner database, this is an efficient use of interview length.

Paper based questionnaires



The costs of telephone survey interviews may prove too expensive for some providers, who may therefore need to rely on other methods to gain information about learner satisfaction. Experience suggests that paper-based surveys are the next best option. Before beginning a questionnaire survey, providers and others still need to consider how to plan the resources involved and analyse the data. This means considering the following:

• have you the resources available in-house to print questionnaires and administer them (by post or for completion in class);

• have you the resources in-house to input data in a suitable manner - this may mean scanning technology or entry via keyboard;

• have you the systems and expertise to analyse data - this may mean access to personnel with experience and ability in using suitable software; and

• have you the expertise to interpret data into a readable and actionable format which allows actions to be taken as well as dissemination.

Questionnaire surveys are usually substantially cheaper than telephone surveys, but a potentially lower response rate must be factored into sample design (such surveys will usually need to sample more learners to achieve the same return as a telephone survey). Non-response may also bias data to some degree, which means having appropriate mechanisms in place to deal with the issues that may arise as a result. An analysis of returns should be made at the end of the exercise, and an evaluation should highlight any groups for whom response is markedly lower. For instance, there may be a lower response rate among younger learners than among older learners.

As mentioned earlier, questionnaire administration can use mail to learners’ home addresses, or questionnaires can be given out in classrooms. Evidence suggests that, for obvious reasons, completing questionnaires in classrooms generally yields higher response rates. A classroom approach also allows teachers or trainers to motivate learners to respond. Teachers/trainers can also make clear the advantages of giving feedback on learner perceptions, i.e. that providers take them seriously, and stress confidentiality. There should always be sealable envelopes into which learners can place completed questionnaires.

If this route is taken, however, and you are using a sample survey (as opposed to a census survey), you may find, for example, that only 10 - 20 learners in any one class have been sampled. The usual way to handle this situation is to ensure that learners sampled this way fill in questionnaires that are coded differently. This means placing an identifier in the corner of the questionnaire to show that these learners were sampled on a random basis, and it is the findings from these questionnaires that should be compared with national data. We are not suggesting that you discard other responses, but that they are not used in national comparisons.
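The suggested analysis of returns can be sketched as follows; this is an illustration only, and the group labels are invented:

```python
from collections import Counter

def response_rates(sampled_groups, returned_groups):
    """Response rate per sub-group, so that groups responding markedly
    less (e.g. younger learners) can be identified and reported."""
    sampled = Counter(sampled_groups)
    returned = Counter(returned_groups)
    return {group: returned[group] / sampled[group] for group in sampled}

# e.g. 200 '16-18' and 300 '19+' questionnaires sent; 80 and 210 returned:
rates = response_rates(["16-18"] * 200 + ["19+"] * 300,
                       ["16-18"] * 80 + ["19+"] * 210)
# rates["16-18"] is 0.4 against 0.7 for "19+", flagging the younger group.
```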


As mentioned above for telephone surveys, if you have age details and other demographics on the database (and these details are up to date for all learners), you may want to avoid asking these questions on the questionnaires you use. If learners’ responses can be linked via a unique identifier to the learner database, this will almost certainly improve efficiency.

1.12 Data analysis and coding

Coding responses

There are three types of question in the NLSS:

• closed questions (such as a list of qualifications a respondent may be taking) which do not probe the respondent further if they give an ‘other’ response;

• closed questions which do probe the respondent further if they give an ‘other’ response; and

• open questions - for instance, once respondents have been asked to rate overall satisfaction they are asked “why is this?”, and the response is typed in verbatim.
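By way of illustration (the keywords and codes here are invented, not taken from the Annex C code-frame), clustering verbatim open responses under broader codes can be as simple as:

```python
from collections import Counter

# Hypothetical mini code-frame: keyword -> broader code.
CODE_FRAME = {
    "teacher": "Quality of teaching",
    "equipment": "Facilities and resources",
    "car park": "Facilities and resources",
}

def code_verbatim(response):
    """Assign the first matching broader code, else 'Other'."""
    text = response.lower()
    for keyword, code in CODE_FRAME.items():
        if keyword in text:
            return code
    return "Other"

verbatims = ["The teacher explains things well",
             "Not enough equipment in the workshop",
             "The car park is too small",
             "It's close to home"]
counts = Counter(code_verbatim(v) for v in verbatims)
```

In practice, coding is a manual judgement exercise rather than keyword matching, but the output is the same: a count of responses per code that can be tabulated alongside the closed questions.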

All questions that are open or offer an ‘other (specify)’ option will need to be coded. When coding these types of question, a member of staff should look at the types of comment being received and cluster them into broader response categories. For instance, when we ask about complaints, many are around the same theme and can be aggregated to reflect high degrees of commonality. Quantitative data always needs to be counted, and code-frames and coding exercises allow this to be done. The beauty of including open questions is that learners give responses in their own words. Even though open questions are presented as coded data, actual quotes are included in reports to qualify and elaborate on the evidence of the quantitative data: sometimes colloquially referred to as putting meat on the bones.

To ensure that you follow the national survey when opening codes for your data, a full code-frame is given in Annex C alongside some examples. The examples are useful as a sense check, so that you can see what sort of comments would typically fall into a given code. On the grounds of comparability, providers and local LSCs should strive to follow these code-frames. This does not mean that new codes cannot be added, as a particular local issue may require this, but the national code-frames should be used as a foundation to build upon in every instance.

Data analysis


If an outside agency is used, there will usually be discussions about how the data is gathered and represented. Providers and others therefore need to ensure they are well informed about what is expected from a survey, the extent of sampling involved, how the data obtained will be managed, and what reports or results they should receive. This is not a passive process: it means active participation by providers and others to obtain best value from surveys. As a minimum, the following variables will usually be needed for cross-tabulations of the data (Annex C gives a full breakdown):

Learner demographics:

• age;

• gender;

• age within gender;

• disability;

• learning difficulty;

• ethnicity; and

• highest qualification on entry.

Learning aims:

• mode of attendance;

• programme area;

• qualification;

• qualification level; and

• whether work-based learning is learning in the workplace or a form of mixed provision.

Calibrating data between telephone interviews and paper based surveys

Throughout this commentary we recommend that, as far as possible, providers incorporate NLSS methodology when conducting local surveys. However, for providers who choose to undertake paper-based surveys, the following advice explains how to calibrate data to allow local benchmarking against national data. Alongside the NLSS telephone interviews, the LSC contracted NOP World to undertake a parallel paper-based test in 2003. A paper-based survey with a large provider in the Midlands was conducted at the same time as one of the waves of telephone interviewing. The objective was to arrive at a means of allowing a paper-based


questionnaire to be compared with national data, as the latter is collected using a telephone survey. The following shows how results from a paper-based survey should be adjusted to allow the best comparison with national data.

Calculating calibration factors

An additive calibration factor is used for simplicity. The difference between the telephone and paper-based questionnaire scores was calculated for relevant outcomes for most of the core questions, producing a calibration factor for each question. The calibration factor was calculated by comparing a response from the telephone survey with the same response from the paper-based survey. The average result for each method was calculated, weighted by the number of interviews that took place using that method; this is known as the weighted mean score. The calibration factor was then calculated by subtracting the weighted mean score from the paper-based questionnaire from the weighted mean score from the telephone survey. The equivalent method is used to calculate the calibration factors using percentages as the data description. The calibration factor represents our best estimate of the likely difference between the outcomes under the different methodologies for each question.

Using calibration factors

The calibration factors, when added to the outcomes (percentages or mean scores as defined) from a paper-based questionnaire, will produce a score or percentage which represents a best estimate of what the outcome would have been if a telephone survey had been used. The following tables show how the calibration factors should be used.
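The arithmetic involved is simple addition and subtraction; this sketch uses the Q22a worked example that appears in the tables which follow (a paper-based mean of 8.06 with a published factor of 0.01):

```python
def calibration_factor(telephone_mean, paper_mean):
    """Additive factor: telephone weighted mean minus paper weighted mean."""
    return telephone_mean - paper_mean

def calibrate(paper_result, factor):
    """Best estimate of the telephone-survey outcome for a paper result."""
    return paper_result + factor

# A paper-based mean of 8.06 for Q22a, with its factor of 0.01, calibrates
# to 8.07 for comparison with the national (telephone) data.
```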


Calibration scores using mean scores as the data description

The calibration scores in the table below are the numbers that need to be added to findings generated from a paper-based questionnaire in your own survey. The questions for which we have calibration factors are given in the left-hand column, and the factor is what you need to add to the mean score generated from responses to that question. For instance, Q22a asks learners to rate teachers/tutors out of 10 for knowledge of the subject, where 10 is excellent. If your own paper-based survey yields a mean score of 8.06 out of 10 for this question, the table shows that you need to add 0.01, making it 8.07. This mean score is now the best possible comparison with the national survey, given that the data were generated via a paper-based rather than a telephone survey.

Question                                                          Calibration factor

Q22A Knowledge of the subject 0.01

Q22B How well they relate to you as a person 0.03

Q22C Making your subject interesting or enjoyable for you -0.10

Q22D Understanding you and how you like to learn 0.08

Q22E The support they give you for example in improving your study techniques or time management -0.10

Q22F Planning their lessons -0.39

Q22G The quality and availability of teaching materials they use -0.31

Q22H Setting clear targets or learning goals to help you improve -0.20

Q22I Providing prompt and regular feedback on progress -0.21

Q22J Managing the group of learners -0.33

Q38. And now taking all the issues we have discussed into account, how satisfied are you with your current learning experience? 0.01


Calibration scores using percentages as the data description

The calibration scores in the table below are the numbers that need to be added to findings generated from a paper-based questionnaire in your own survey. The questions for which we have calibration factors are given in the left-hand column, and the factor is what you need to add to the percentage generated from responses to that question. For instance, Q22a asks learners to rate teachers/tutors out of 10 for knowledge of the subject, where 10 is excellent. If your own paper-based survey shows that 85 per cent of learners give teachers/tutors at least 8 out of 10, the table shows that you need to add 0.7 per cent, making it 85.7 per cent (86 per cent after rounding). This percentage is now the best possible comparison with the national survey, given that the data were generated via a paper-based rather than a telephone survey.

Question                                                          Calibration factor

Q22A Knowledge of the subject 0.7%

Q22B How well they relate to you as a person 0.8%

Q22C Making your subject interesting or enjoyable for you -2.6%

Q22D Understanding you and how you like to learn 0.7%

Q22E The support they give you for example in improving your study techniques or time management 0.5%

Q22F Planning their lessons -7.4%

Q22G The quality and availability of teaching materials they use -10.8%

Q22H Setting clear targets or learning goals to help you improve -2.6%

Q22I Providing prompt and regular feedback on progress -5.0%

Q22J Managing the group of learners -9.0%

Q38 And now taking all the issues we have discussed into account, how satisfied are you with your current learning experience? 0.2%

Q45B How likely will you be to undertake further learning in the future (say in the next 3 years)? 5.7%


ANNEX A (I): THE QUESTIONNAIRE (TELEPHONE BASED)

The core questions were as follows:

ASK ALL
Q21 Now moving on to teaching/training. Overall, how satisfied are you with the quality of the teaching/training at COLLEGE / PROVIDER / WORKPLACE? Would you say you are:
READ OUT AND CODE ONE ONLY
Extremely satisfied.............................. 1
Very satisfied................................... 2
Fairly satisfied................................. 3
Neither satisfied nor dissatisfied............... 4
Fairly dissatisfied.............................. 5
Very dissatisfied................................ 6
Extremely dissatisfied........................... 7
Don't know....................................... Y

ASK ALL
Q22 How would you rate the teachers, tutors or trainers on the following aspects of teaching/training? Please score on a scale of 1 to 10, where 1 represents very poor and 10 excellent.
IF NEEDED - WE RECOGNISE THAT RATINGS MAY VARY FOR INDIVIDUAL TEACHERS/TRAINERS - PLEASE TRY YOUR BEST TO GIVE US AN OVERALL RATING.
(PRECODES ROTATED)
-1- Knowledge of the subject
-2- How well they relate to you as a person
-3- Making your subject interesting or enjoyable for you
-4- Understanding you and how you like to learn
-5- The support they give you for example in improving your study techniques or time management
-6- Planning their lessons
-7- The quality and availability of teaching materials they use
-8- Setting clear targets or learning goals to help you improve
-9- Providing prompt and regular feedback on progress
-10- Managing the group of learners
SCALE: 1 - Very Poor ... 10 - Excellent; No answer; Don't know


ASK ALL
Q23A Would you say that all of your lessons or training sessions are put to good use or are some wasted?
READ OUT AND CODE ONE ONLY
More than half are wasted........................ 1
Between a quarter and a half are wasted.......... 2
10-24% are wasted................................ 3
Less than 10% are wasted......................... 4
None are wasted.................................. 5
Don't know....................................... Y

ASK ALL
Q24B Which of the following situations have you encountered on a fairly regular basis?
READ OUT. PROBE FOR OTHERS
Being left hanging around with nothing to do..... 1
Teachers/tutors arriving late by 5 minutes or more 2
Other students arriving late by 5 minutes or more 3
Other students making a noise and disrupting class 4
Teachers going at too slow a pace................ 5
Teachers/tutors being absent..................... 6
Lessons being badly planned/disorganised......... 7
Lack of resources/poor equipment................. 8
Lessons finishing early.......................... 9
Lessons finishing late........................... 0
Teachers going at too fast a pace................ 1
Other (specify).................................. 2

ASK ALL
Q26 In general, how do you feel about the feedback on how you are doing from your teachers/tutors?
READ OUT. CODE ONE ONLY. PROBE FOR OTHERS
Motivating....................................... 1
Demotivating..................................... 2
No effect either way............................. 3
Other (specify).................................. 0
No answer........................................ X
Don't know....................................... Y

ASK ALL
Q38 And now taking all the issues we have discussed into account, how satisfied are you with your current learning experience at COLLEGE / PROVIDER / WORKPLACE?
READ OUT AND CODE ONE ONLY
Extremely satisfied.............................. 1
Very satisfied................................... 2
Fairly satisfied................................. 3
Neither satisfied nor dissatisfied............... 4
Fairly dissatisfied.............................. 5
Very dissatisfied................................ 6
Extremely dissatisfied........................... 7
Don't know....................................... Y

ASK ALL
Q39 You said you were ....., what is the main reason for this?
PROBE FULLY
Refused.......................................... {
Don't know....................................... Y


ASK ALL Q45B How likely will you be to undertake further learning in the future (say in the next 3 years)?
Very likely...................................... 1
Fairly likely.................................... 2
Fairly unlikely.................................. 3
Very unlikely.................................... 4
Don't know....................................... Y

Modular questions

The full content of the questionnaire for each wave was as follows:

Wave 6: Core questions plus questions about pre-entry
Wave 7: Core questions plus questions about support (including health and safety)
Wave 8: Core questions plus questions about impact of learning

Modular questions for wave 6: Pre-entry questions

NQ10 Which of the following influenced your choice of course? READ OUT, PROBE FOR OTHER Because it fitted in with my future career plans. 1 It was relevant to my job........................ 2 Because I like the subject....................... 3 To gain qualifications........................... 4 Because it was on offer at my local college/this college.......................................... 5 Because it was at a time that suited me.......... 6 To advance my skills and knowledge in this area.. 7 Suggested or required by employer................ 8 Suggested by others.............................. 9 For my own personal interest..................... 0 To meet other people/make new friends............ 1 Other (specify).................................. IF 16/17/18 YEARS OLD Q11B Apart from COLLEGE/WORKPLACE/PROVIDER, when you were deciding where to study or train, which OTHER college or provider did you consider? Did you consider….READ OUT. INTERVIEWER: EMPHASISE ‘OTHER’ COLLEGE OR PROVIDER A different further education college...............1 School sixth form................................ 2 Sixth form college............................... 3 Specialist sixth form (e.g. art and design college, agricultural college)................... 4 Modern Apprenticeship............................ 5 No other options considered 6 Other (specify) 0 Don't know....................................... Y


ASK ALL FE (NOT WBL) Q11BII And was COLLEGE/WORKPLACE/PROVIDER your first choice i.e. the place where you most wanted to do your course? Yes 1 No 2 Didn’t have any choice 3 ASK ALL Q11BIII Was the course you are currently doing your first choice i.e. the subject and qualification you most wanted to do? Yes 1 No 2 Didn’t have any choice 3 ASK ALL NQ11C What are your main reasons for deciding to attend COLLEGE / PROVIDER / WORKPLACE for your course? DO NOT READ OUT. CODE ALL THAT APPLY Convenient location/nearest...................... 1 Offered course I wanted.......................... 2 Has best reputation (general).................... 3 Has best reputation for pass rates............... 4 Has best reputation for my course................ 5 Friends were going there/friend recommended...... 6 Recommended by career advisor/school............. 7 Offered a course at convenient times for me...... 8 Had no choice -employer chose.................... 9 Had no choice – only one that accepted me 0 Had no choice – no other providers in this area 1 ASK ALL Q15D Did you obtain advice about your choice of course or college/provider from any of the following? READ OUT, CODE ALL THAT APPLY. (PRECODES ROTATED) College admissions office........................ 1 Teachers at school............................... 2 Teachers/tutors at college/provider.............. 3 Parents or other family members.................. 4 Friends.......................................... 5 Employer......................................... 6 School Careers Advisor........................... 7 Advisor at Connexions............................ 8 Advisor at an information and guidance centre.... 9 Other 1.......................................... 0 Other 2.......................................... 1 Other 3.......................................... 2 ___________________________________________________________________


FOR MAXIMUM OF TWO MENTIONS AT PREVIOUS QUESTION (IF RESPONDENT MENTIONS MORE THAN TWO, CATI WILL RANDOMLY SELECT TWO AGENCIES)
Q16 How useful was the advice you received from ....
College admissions office........................ 1
Teachers at school............................... 2
Teachers/tutors at college/provider.............. 3
Parents or other family members.................. 4
Friends.......................................... 5
Employer......................................... 6
School Careers Advisor........................... 7
Advisor at Connexions............................ 8
Advisor at an information and guidance centre.... 9
Other 1.......................................... 0
Other 2.......................................... 1
Other 3.......................................... 2

(Asked separately for each selected source)
Very useful....................................... 1
Fairly useful..................................... 2
Not very useful................................... 3
Not useful at all................................. 4
Don't know........................................ Y

Modular questions for wave 7: Learner support questions

ASK ALL NQ20 Thinking about the site where you do most of your course or training, and of health and safety specifically, which of the following did COLLEGE / PROVIDER / WORKPLACE inform you about? READ OUT AND CODE (YES/NO) (PRECODES ROTATED)
-1- Emergency arrangements for fire
-2- Emergency arrangements for first aid and how to report an accident
-3- Who to ask for any health and safety advice or instructions
-4- Any dangers involved with your training and how to work safely
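The interviewer instruction at Q16 relies on the CATI script randomly selecting two advice sources when a respondent names more than two at the previous question. A minimal sketch of that selection logic is below; the function and variable names are illustrative, not part of the actual NLSS CATI script.

```python
import random

def select_for_q16(mentions, max_follow_ups=2, rng=None):
    """Pick the advice sources to rate at Q16.

    If the respondent named more than `max_follow_ups` sources,
    randomly select which ones to follow up, mirroring the CATI
    instruction; otherwise follow up all of them.
    """
    rng = rng or random
    if len(mentions) <= max_follow_ups:
        return list(mentions)
    return rng.sample(mentions, max_follow_ups)

# Example: a respondent who mentioned three sources
mentions = ["Teachers at school", "Parents or other family members", "Friends"]
selected = select_for_q16(mentions)
# `selected` now holds exactly two of the three mentions, chosen at random
```

Random selection (rather than taking the first two mentions) avoids biasing the Q16 ratings towards sources that happen to be read out or recalled first.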

NQ27 How well do you think the following issues were managed... Please score on a scale of 1 to 10, where 1 represents very poor and 10 excellent. IF NEEDED - WE RECOGNISE THAT RATINGS MAY VARY OVER TIME - PLEASE TRY YOUR BEST TO GIVE US AN OVERALL RATING. (PRECODES ROTATED). USE NOT APPLICABLE IF NEEDED - PARTICULARLY FOR SOME OF THE ASSESSOR RELATED PRECODES

Making sure enough teachers/tutors/ trainers and/or assessors are available 1 Providing support when I or other learners have problems 2 Helping new people settle in 3 Managing timetables so that they suit the learner as best they can 4 Communicating changes in times for sessions 5 Teachers/Tutors/Assessors turning up as planned 6

Seeing the same teacher/tutor/assessor throughout 7


ASK ALL Q29 Since you started the course have you had any problems with any of the following? READ OUT AND PROBE FOR OTHERS. IF NEEDED DESCRIBE DISCRIMINATION AS HARASSMENT/BULLYING/TREATED UNFAIRLY (PRECODES ROTATED)
Managing to fit course commitments in with other commitments at home.............................. 1
Managing to fit course commitments in with other commitments at work.............................. 2
Managing to keep up with the standard of work required......................................... 3
Dealing with money pressures..................... 4
Discrimination of any kind from other students... 5
Discrimination of any kind from a member of COLLEGE/PROVIDER/WORKPLACE staff.................. 6
Maintaining your personal motivation............. 7
Travel to college/training centre................ 8
Extra help you were promised not being provided.. 9
Reading/writing skills........................... 0
Maths or numeracy skills......................... 1
Other (specify).................................. 2
No answer........................................ X
Refused.......................................... {
Don't know....................................... Y
______________________________________________________________________________
IF CODED OTHER Q29OTH What other problems?
IF YES TO ANY STATEMENTS AT Q29 Q30 Have you sought advice or help from the COLLEGE/PROVIDER/WORKPLACE on any of these matters?
Yes.............................................. 1
No............................................... 2
Don't know....................................... Y
______________________________________________________________________________
IF YES AT Q30 Q31 Generally, how useful was this?
Very useful...................................... 1
Fairly useful.................................... 2
Not very useful.................................. 3
Not at all useful................................ 4
Don't know....................................... Y
______________________________________________________________________________
Q32 ALL EXCEPT WORK BASED LEARNERS IN THE WORKPLACE – Have you ever made a complaint to the college about your course or other experiences? WORK BASED LEARNERS IN THE WORKPLACE – Have you ever made a complaint to your employer about your training?
Yes.............................................. 1
No............................................... 2
Don't know....................................... Y
______________________________________________________________________________


IF YES AT Q32 Q32I What was your complaint about? PROBE THOROUGHLY Don't know.......................................Y ______________________________________________________________________________ IF YES AT Q32 Q33 Which of these best describes the outcome to your complaint? There was an outcome that satisfied me........... 1 There was an outcome but it did not satisfy me... 2 There was no outcome to the complaint at all..... 3 Don't know.......................................

Modular questions for wave 8: Impact questions

ASK ALL Q40 Thinking back to when you left school would you say that you had... READ OUT. CODE ONE ONLY
Generally positive feelings about education...... 1
Generally negative feelings about education...... 2
Not bothered either way about education - indifferent...................................... 3
No answer........................................ X
Don't know....................................... Y
___________________________________________________________________
ASK ALL Q41 I am going to read out a number of statements which describe the way some people feel about learning, and I would like you to choose the ones that apply to you and the way you feel now. READ OUT AND CODE ALL THAT APPLY. PROBE FOR OTHER (PRECODES ROTATED)
I enjoy learning and get a buzz from it.......... 1
I am carrying on learning because I can't think of anything better to do......................... 2
I enjoy learning mostly because of the social aspects.......................................... 3
I don't really enjoy learning.................... 4
NULL............................................. Y


ASK ALL Q42II I am going to read out a few statements about what effect the course may have had on you personally. Could you tell me whether you agree or disagree with each. (PRECODES ROTATED) -1- I have a greater enthusiasm for the subject -2- It has given me skills I can use for a job -3- I feel more confident socially -4- I feel more confident in my ability to learn -5- I am better at managing my time and responsibilities -6- I feel more positive about learning than I did when I started -7- I am more creative and prepared to try new things -8- I am better at learning on my own now -9- It enables me to cope better with daily life -10- It has benefited my health and sense of well being -11- I now take a more active part in the community (ONLY IF RESPONDENT IS 25 PLUS) Agree............. 1 1 1 1 1 1 Disagree.......... 2 2 2 2 2 2 Don't know........ Y Y Y Y Y Y


ANNEX A (II): THE QUESTIONNAIRE (PAPER BASED)


1 Overall, how satisfied are you with the quality of the teaching/training at this college/provider? Cross one box only.

Extremely satisfied

Very satisfied

Fairly satisfied

Neither satisfied nor dissatisfied

Fairly dissatisfied

Very dissatisfied

Extremely dissatisfied

Don’t know

2 How would you rate the teachers, tutors or trainers on the following aspects of teaching/training? Please score on a scale of 1 to 10, where 1 represents very poor and 10 excellent. We recognise that ratings may vary for individual teachers/tutors – please try your best to give us an overall rating. Cross one box on each row.

Very poor (1)    Excellent (10)    Don't know

Knowledge of the subject

How well they relate to you as a person

Making your subject interesting or enjoyable for you

Understanding you and how you like to learn

The support they give you for example in improving your study techniques or time management

Planning their lessons

The quality and availability of teaching materials they use

Setting clear targets or learning goals to help you improve

Providing prompt and regular feedback on progress

Managing the group of learners

3 Would you say that all of your lessons or training sessions are put to good use or are some wasted? Cross one box only.

More than half are wasted

Between a quarter and a half are wasted

10-24% are wasted

Less than 10% are wasted

None are wasted

Don’t know

LEARNER QUESTIONNAIRE
All responses will be handled in the strictest confidence.

Using BLACK ink, please indicate your answers with a cross (X).

Please return your completed questionnaire to your tutor by

NAME OF COLLEGE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

COURSE TITLE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

LEVEL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

START DATE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .



4 Which of the following situations have you encountered on a fairly regular basis? Cross all that apply.

Being left hanging around with nothing to do

Teachers/tutors arriving late by 5 minutes or more

Other students arriving late by 5 minutes or more

Other students making a noise and disrupting class

Teachers going at too slow a pace

Teachers going at too fast a pace

Teachers/tutors being absent

Lessons being badly planned/disorganised

Lack of resources/poor equipment

Lessons finishing early

Lessons finishing late

Other (please write in)

None of these

Don’t know

5 In general, how do you feel about the feedback you receive from your teachers/tutors on how you are doing? Cross one box only.

Motivating

Demotivating

No effect either way

Don’t receive any/enough feedback

Other (please write in)

Don’t know

6 And now taking all the issues we have discussed into account, how satisfied are you with your current learning experience at college/provider? Cross one box only.

Extremely satisfied

Very satisfied

Fairly satisfied

Neither satisfied nor dissatisfied

Fairly dissatisfied

Very dissatisfied

Extremely dissatisfied

Don’t know

7 You have just given us a rating of overall satisfaction. Why did you say this?

8 How likely will you be to undertake further learning in the future (say in the next 3 years)?

Very likely

Fairly likely

Fairly unlikely

Very unlikely

Don’t know

please go to Q8


OFFICE USE ONLY

NOP450431 01/05KW Ver1.2


ANNEX B: DATA ACCURACY LEVELS
The table below shows the accuracy levels at the 95 per cent confidence level for different finding levels and different sample sizes, assuming an effectively unlimited population size (no finite-population correction).

For example, with a sample size of 1,000 there is a maximum margin of error of plus or minus 3.1 per cent at the 95 per cent confidence level, based on a 50 per cent finding. This means that for a survey finding of "50 per cent think x", there is a 95 per cent probability that the true proportion in the population, after rounding, lies between 47 per cent and 53 per cent. The more people you interview, the more reliable the data become. A 95 per cent confidence level means there is only a 1 in 20 chance that the true population figure falls outside the stated range.

95% confidence intervals for different expected proportions and sample sizes

Sample size   50%/50%     40%/60%     30%/70%     20%/80%
100           +/- 9.8%    +/- 9.6%    +/- 9.0%    +/- 7.8%
200           +/- 6.9%    +/- 6.8%    +/- 6.4%    +/- 5.5%
300           +/- 5.7%    +/- 5.5%    +/- 5.2%    +/- 4.5%
400           +/- 4.9%    +/- 4.8%    +/- 4.5%    +/- 3.9%
500           +/- 4.4%    +/- 4.3%    +/- 4.0%    +/- 3.5%
600           +/- 4.0%    +/- 3.9%    +/- 3.7%    +/- 3.2%
700           +/- 3.7%    +/- 3.6%    +/- 3.4%    +/- 3.0%
800           +/- 3.5%    +/- 3.4%    +/- 3.2%    +/- 2.8%
900           +/- 3.3%    +/- 3.2%    +/- 3.0%    +/- 2.6%
1,000         +/- 3.1%    +/- 3.0%    +/- 2.8%    +/- 2.5%
1,500         +/- 2.5%    +/- 2.5%    +/- 2.3%    +/- 2.0%
2,000         +/- 2.2%    +/- 2.1%    +/- 2.0%    +/- 1.8%
2,500         +/- 2.0%    +/- 1.9%    +/- 1.8%    +/- 1.6%

Note: this assumes the responses are from a completely unclustered random sample.


ANNEX C: SUGGESTED CROSS TABULATIONS
We have based the following suggestions on a survey of 1,000 learners in a further education college and 500 learners in a work-based learning survey. This limits the scope for analysing sub-sets, but it indicates the minimum breakdowns you should aim to analyse. We do not suggest that you look at any sub-set of the sample with fewer than 100 learners in it.

FURTHER EDUCATION (POSSIBLE CROSS TABULATIONS BASED ON A TOTAL OF 1,000 INTERVIEWS)

Learner demographics
Age 16 - 18

19 - 24
25 - 34 (see NLSS where this is 25+)
35 plus (see NLSS where this is 25+)

Gender Male Female

Age within gender Males 16 - 24 Females 16 - 24 Males 25 plus Females 25 plus

Disability Yes No

Learning difficulty Yes No

Ethnicity White Minority ethnic group

Highest qualification held Level 3 or higher Below level 3

Learning aim Mode of attendance Full time

Part time

Area of learning: depending on what you deliver, you will need to collapse the 14 areas of learning in the most appropriate way – aim to have at least 100 learners in each collapsed grouping

Qualification NVQ Other vocational Academic

Level Level 3 or higher Below level 3


WORK BASED LEARNING (POSSIBLE CROSS TABULATIONS BASED ON A TOTAL OF 500 INTERVIEWS)

Learner demographics
Age 16 - 18

19 - 24 25 plus

Gender Male Female

Disability Yes No

Learning difficulty Yes No

Ethnicity White Ethnic minority group

Highest qualification held Level 3 or higher Below level 3

Learning aim Mode of attendance Full time

Part time

Area of learning: depending on what you deliver, you will need to collapse the 14 areas of learning in the most appropriate way – aim to have at least 100 learners in each collapsed grouping

Qualification NVQ Other vocational

Level Level 3 or higher Below level 3

Delivery Workplace only Workplace and provider Provider only
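Both breakdown plans above require collapsing the 14 areas of learning until every grouping holds at least 100 learners. One mechanical way to reach that threshold is to repeatedly merge the smallest groups, as in the sketch below; the area names and counts are invented for illustration, and in practice you should merge related areas rather than arbitrary ones.

```python
def collapse_areas(counts, min_size=100):
    """Greedily merge the smallest areas of learning until every
    collapsed grouping holds at least `min_size` learners.

    `counts` maps area name -> number of sampled learners.
    Returns a list of (list_of_areas, total_learners) pairs.
    """
    groups = [([area], n) for area, n in counts.items()]
    groups.sort(key=lambda g: g[1])
    while len(groups) > 1 and groups[0][1] < min_size:
        a_names, a_n = groups.pop(0)   # smallest group
        b_names, b_n = groups.pop(0)   # next smallest
        groups.append((a_names + b_names, a_n + b_n))
        groups.sort(key=lambda g: g[1])
    return groups

# Hypothetical counts from a 1,000-learner FE survey
counts = {"Construction": 40, "Health & care": 260, "Business": 310,
          "Land-based": 25, "Arts & media": 180, "Sciences": 90,
          "Hospitality": 95}
for areas, total in collapse_areas(counts):
    print(sorted(areas), total)  # every printed total is >= 100
```

Greedy merging guarantees the size threshold but ignores subject coherence, so treat its output as a starting point and adjust the groupings to keep related areas together.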


Learning and Skills Council National Office
Cheylesmore House
Quinton Road
Coventry CV1 2WT
T 0845 019 4170
F 024 7682 3675
www.lsc.gov.uk/

NOP Research Group is the UK arm of NOP World, the ninth largest research agency in the world. In the UK, NOP employs around 600 staff, split into specialist divisions. The team working on the LSC's national learner satisfaction survey sits in the division that specialises in social research, with 20 researchers dedicated to this field. The team has relevant experience both of large-scale social research surveys and of carrying out customer satisfaction research.

© LSC April 2005. Published by the Learning and Skills Council. Extracts from this publication may be reproduced for non-commercial educational or training purposes, on condition that the source is acknowledged and the findings are not misrepresented. This publication is available in electronic form on the Learning and Skills Council website: www.lsc.gov.uk For hard copies of our publications in other formats and languages, call our publication enquiries line: 0870 900 6800. Publication reference: LSC-P-NAT-050168