Telephone versus face-to-face interviewing: mode effects on data quality and likely causes: report on phase II of the ESS-Gallup mixed mode methodology project

Publication type

ISER Working Paper Series

Series Number

2006-41

Series

ISER Working Paper Series

Authors

Publication date

August 17, 2006

Summary:

The European Social Survey currently insists on face-to-face interviewing as its sole mode of data collection. However, owing to the mounting costs of carrying out face-to-face interviews and the divergent traditions and experiences of survey research across the different countries participating in the survey, there is a growing need to explore the alternatives offered by mixed mode data collection designs. Even relatively simple mixed mode designs - such as a switch to telephone interviewing in a small number of countries - could, however, threaten data quality, disrupting the continuity of the time series for the countries concerned and affecting the validity of cross-cultural comparisons.

The present experimental study is part of an ongoing methodological programme of research designed to inform decisions by the European Social Survey on whether to move to a mixed-mode data collection strategy in the future and, if so, which modes to mix and how. The aim of phase II, the subject of this report, was to assess the likely impact of a switch to telephone interviewing on data quality and to investigate the causes of mode effects in order to identify ways of mitigating them.

The main differences between face-to-face and telephone interviewing are the channels of communication and the physical presence of the interviewer. In a face-to-face setting, showcards can be used to make it easier for the respondent to understand questions and remember response categories. Over the telephone this aid is not available, making the response task more challenging. Similarly, the physical presence of the interviewer means that a range of non-verbal channels of communication are available. The interviewer may detect signs of waning motivation or misunderstanding and frustration on the part of the respondent and react to these more easily than over the telephone. Finally, face-to-face respondents are less likely to be engaged in other activities while answering survey questions and interviews are typically carried out at a slower pace than over the telephone.

As a result of these differences between the two modes, telephone respondents are likely to make less effort in answering survey questions (referred to as satisficing), resulting in different response distributions. For example, telephone respondents are more likely to say 'yes' or 'agree' and more likely to choose the same answer category for batteries of questions using the same scale. Face-to-face respondents may in turn report sensitive behaviours or attitudes less truthfully, since they will be more aware of the interviewer's reaction to their answers than a telephone respondent would be. As a result, face-to-face respondents may be more likely to edit responses to appear in a more favourable light (referred to as social desirability bias). On the other hand, it may be easier for interviewers to establish rapport in a face-to-face setting. As a result the respondent might feel more comfortable reporting socially undesirable behaviours or attitudes.
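By way of illustration only (this sketch is not part of the study), such satisficing indicators are often operationalised as simple per-respondent scores. The Python helper below computes an acquiescence score and a straight-lining flag for a battery of agree-disagree items; the item coding, column names and use of pandas are all assumptions.

import pandas as pd

AGREE_CODES = [1, 2]  # assumed coding: 1 = 'agree strongly', 2 = 'agree'

def satisficing_indicators(df: pd.DataFrame, battery: list[str]) -> pd.DataFrame:
    """Per-respondent satisficing indicators for one battery of items."""
    answers = df[battery]
    return pd.DataFrame({
        # Acquiescence: share of battery items answered with an 'agree' code.
        "acquiescence": answers.isin(AGREE_CODES).mean(axis=1),
        # Straight-lining: the same response category chosen for every item.
        "straight_lining": answers.nunique(axis=1) == 1,
    })

The mean of each indicator could then be compared between the face-to-face and telephone groups.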

Although previous studies have tested differences in responses across modes, their ability to infer the likely causes of those differences was often limited. It is, for example, often not possible to distinguish whether observed mode differences are a function of characteristics of the question (such as question wording, response alternatives, or the degree of sensitivity or complexity), characteristics of the mode (such as the presence or absence of an interviewer, or the channel of communication - visual or aural - used for the question stimulus and response), or characteristics of the respondent (such as the propensity to satisfice or to give socially desirable responses).

Our study enabled us to distinguish mode effects caused by differences in the type of question stimulus used in each mode (audio vs. visual) and mode effects caused by the presence or absence of the interviewer. Since the European Social Survey relies heavily on the use of showcards, disentangling these effects is particularly important. The design included three comparison groups: two interviewed face-to-face (one with showcards, one without) and the third by telephone.

Mode significantly affected response distributions for over a third of the items tested. The differences between modes appeared to be small, however, and did not affect the overall relationships between variables. Since the items included in the experiment were those deemed most sensitive to mode effects, the findings suggested that a switch to telephone mode might not affect the conclusions analysts would draw from the ESS data.
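As a rough illustration of how such per-item comparisons of response distributions can be carried out (a sketch under assumed data structures, not the procedure used in the report), each item can be cross-tabulated against the three experimental groups and tested with a chi-squared test of independence; the DataFrame layout and column names below are hypothetical.

import pandas as pd
from scipy.stats import chi2_contingency

def mode_effect_tests(df: pd.DataFrame, items: list[str]) -> pd.DataFrame:
    """Chi-squared test of response distribution by group for each item.

    Assumes a 'mode' column coded 'ftf_showcards', 'ftf_no_showcards'
    or 'telephone', and one categorical column per survey item.
    """
    results = []
    for item in items:
        table = pd.crosstab(df[item], df["mode"])
        chi2, p, dof, _ = chi2_contingency(table)
        results.append({"item": item, "chi2": chi2, "dof": dof, "p": p})
    return pd.DataFrame(results).sort_values("p")

# Example call (hypothetical item names):
# mode_effect_tests(responses, ["trust_in_parliament", "interest_in_politics"])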

Most differences appeared to be due to the presence of the interviewer rather than the sensory channel, since mode effects were observed between the face-to-face and telephone modes but not between the two face-to-face groups. In general, we found no evidence that using showcards influenced response quality, either positively or negatively. This suggests that the ESS showcard questions were successfully adapted for use over the telephone by keeping modifications to a minimum. The main problems arose in the adaptation of numerical questions (about household income and hours spent watching television), which are formulated as banded questions in the ESS. Changing these to open-ended questions for the telephone resulted in large differences in response distributions.
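One way of putting the open-ended telephone answers and the banded face-to-face answers on a common footing (a sketch with assumed band edges and labels, not the actual ESS categories) is to recode the open-ended values back into showcard-style bands before comparing distributions:

import pandas as pd

# Assumed band edges for hours of television watched on an average weekday.
TV_BANDS = [0, 0.5, 1, 1.5, 2, 2.5, 3, float("inf")]
TV_LABELS = ["<0.5h", "0.5-1h", "1-1.5h", "1.5-2h", "2-2.5h", "2.5-3h", "3h+"]

def band_tv_hours(hours: pd.Series) -> pd.Series:
    """Map open-ended hours to showcard-style bands (left-closed intervals)."""
    return pd.cut(hours, bins=TV_BANDS, labels=TV_LABELS, right=False)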

Unlike previous studies, we found no support for the hypothesis that telephone respondents are more likely to satisfice. This suggests that the physical presence of the interviewer affected neither the difficulty of the response task nor the effort made by respondents. The experimental survey was, however, much shorter and more varied than the full ESS survey, so the possibility that telephone interviewing could lead to more satisficing cannot be excluded.

The most notable finding was that telephone respondents were more likely to give socially desirable responses across a range of indicators. This suggests that the advantages of the trust built up in the face-to-face interview outweighed any disadvantages due to the lack of anonymity. In order to mitigate this effect, however, more research is needed to understand the cognitive processes underlying social desirability bias. The traditional theory holds that social desirability bias is the result of deliberate editing of responses. In our study, however, the bias was more prevalent with telephone interviewing, even though these interviews were conducted at a faster pace, suggesting that respondents did not take additional time to edit their responses. Instead, respondents may have selected the most socially desirable response because it was the easiest, most accessible or most salient answer available to them without expending much effort on the question. Depending on the cause, the implications for reducing the impact of social desirability bias are clearly very different.

Finally, little is known about cultural differences in social desirability bias and the extent to which our findings would replicate in other countries participating in the ESS. Differences may exist in the connotations of particular subjects, the social norms governing different types of behaviour and the importance of impression management strategies.

Subject

Notes

Is referenced by OECD (2013) 'Methodological considerations in the measurement of subjective well-being', in OECD, 'OECD Guidelines on Measuring Subjective Well-being'. Paris: OECD Publishing. Ch. 2: 61-138.

Paper download  
