
How to carry out great interviews in qualitative research

An interview is one of the most versatile methods used in qualitative research. Here’s what you need to know about conducting great qualitative interviews.

What is a qualitative research interview?

Qualitative research interviews are a mainstay among qualitative research techniques, and have been in use for decades either as a primary data collection method or as an adjunct to a wider research process. A qualitative research interview is a one-to-one data collection session between a researcher and a participant. Interviews may be carried out face-to-face, over the phone, or via video call using a service like Skype or Zoom.

There are three main types of qualitative research interview: structured, unstructured, and semi-structured.

  • Structured interviews: These are based around a schedule of predetermined questions and talking points that the researcher has developed. At their most rigid, structured interviews may have precise wording and question order, meaning they can be replicated across many different interviewers and participants with relatively consistent results.
  • Unstructured interviews: These have no predetermined format, although that doesn’t mean they’re ad hoc or unplanned. An unstructured interview may outwardly resemble a normal conversation, but the interviewer will in fact be working carefully to make sure the right topics are addressed during the interaction while putting the participant at ease with a natural manner.
  • Semi-structured interviews: The most common type of qualitative research interview, semi-structured interviews combine the informality and rapport of an unstructured interview with the consistency and replicability of a structured interview. The researcher will come prepared with questions and topics, but will not need to stick to precise wording. This blended approach can work well for in-depth interviews.


What are the pros and cons of interviews in qualitative research?

As a qualitative research method, interviewing is hard to beat, with applications in social research, market research, and even basic and clinical pharmacy. But like any aspect of the research process, it’s not without its limitations. Before choosing qualitative interviewing as your research method, it’s worth weighing up the pros and cons.

Pros of qualitative interviews:

  • provide in-depth information and context
  • can be used effectively when there are low numbers of participants
  • provide an opportunity to discuss and explain questions
  • useful for complex topics
  • rich in data – in the case of in-person or video interviews, the researcher can observe body language and facial expressions as well as the answers to questions

Cons of qualitative interviews:

  • can be time-consuming to carry out
  • costly when compared to some other research methods
  • because of time and cost constraints, they often limit you to a small number of participants
  • difficult to standardize your data across different researchers and participants unless the interviews are very tightly structured
  • may take an emotional toll on interviewers, as the Open University of Hong Kong notes

Qualitative interview guides

Semi-structured interviews are based on a qualitative interview guide, which acts as a road map for the researcher. While conducting interviews, the researcher can use the interview guide to help them stay focused on their research questions and make sure they cover all the topics they intend to.

An interview guide may include a list of questions written out in full, or it may be a set of bullet points grouped around particular topics. It can prompt the interviewer to dig deeper and ask probing questions during the interview if appropriate.

Consider writing out the project’s research question at the top of your interview guide, ahead of the interview questions. This may help you steer the interview in the right direction if it threatens to head off on a tangent.


Avoid bias in qualitative research interviews

According to Duke University, bias can create significant problems in your qualitative interview.

  • Acquiescence bias is common to many qualitative methods, including focus groups. It occurs when the participant feels obliged to say what they think the researcher wants to hear. This can be especially problematic when there is a perceived power imbalance between participant and interviewer. To counteract this, Duke University’s experts recommend emphasizing the participant’s expertise in the subject being discussed, and the value of their contributions.
  • Interviewer bias is when the interviewer’s own feelings about the topic come to light through hand gestures, facial expressions or turns of phrase. Duke’s recommendation is to stick to scripted phrases where this is an issue, and to make sure researchers become very familiar with the interview guide or script before conducting interviews, so that they can hone their delivery.

What kinds of questions should you ask in a qualitative interview?

The interview questions you ask need to be carefully considered both before and during the data collection process. As well as considering the topics you’ll cover, you will need to think carefully about the way you ask questions.

Open-ended interview questions – which cannot be answered with a ‘yes’, ‘no’ or ‘maybe’ – are recommended by many researchers as a way to pursue in-depth information.

An example of an open-ended question is “What made you want to move to the East Coast?” This will prompt the participant to consider different factors and select at least one. Having thought about it carefully, they may give you more detailed information about their reasoning.

A closed-ended question, such as “Would you recommend your neighborhood to a friend?” can be answered without too much deliberation, and without giving much information about personal thoughts, opinions and feelings.

Follow-up questions can be used to delve deeper into the research topic and to get more detail from open-ended questions. Examples of follow-up questions include:

  • What makes you say that?
  • What do you mean by that?
  • Can you tell me more about X?
  • What did/does that mean to you?

As well as avoiding closed-ended questions, be wary of leading questions. As with other qualitative research techniques such as surveys or focus groups, these can introduce bias in your data. Leading questions presume a certain point of view shared by the interviewer and participant, and may even suggest a foregone conclusion.

An example of a leading question might be: “You moved to New York in 1990, didn’t you?” In answering the question, the participant is much more likely to agree than disagree. This may be down to acquiescence bias or a belief that the interviewer has checked the information and already knows the correct answer.

Other leading questions involve adjectival phrases or other wording that introduces negative or positive connotations about a particular topic. An example of this kind of leading question is: “Many employees dislike wearing masks to work. How do you feel about this?” It presumes a negative opinion, and the participant may be swayed by it, or may not want to contradict the interviewer.

Harvard University’s guidelines for qualitative interview research add that you shouldn’t be afraid to ask embarrassing questions – “if you don’t ask, they won’t tell.” Bear in mind though that too much probing around sensitive topics may cause the interview participant to withdraw. The Harvard guidelines recommend leaving sensitive questions until the later stages of the interview, when a rapport has been established.

More tips for conducting qualitative interviews

Observing a participant’s body language can give you important data about their thoughts and feelings. It can also help you decide when to broach a topic, and whether to use a follow-up question or return to the subject later in the interview.

Be conscious that the participant may regard you as the expert, not themselves. To make sure they express their opinions openly, use active listening skills like verbal encouragement, paraphrasing, and clarifying their meaning to show how much you value what they are saying.

Remember that part of the goal is to leave the interview participant feeling good about volunteering their time and their thought process to your research. Aim to make them feel empowered, respected and heard.

Unstructured interviews can demand a lot of a researcher, both cognitively and emotionally. Be sure to leave time between in-depth interviews when scheduling your data collection, to maintain the quality of your data as well as your own well-being.

Recording and transcribing interviews

Historically, recording qualitative research interviews and then transcribing the conversation manually would have represented a significant part of the cost and time involved in research projects that collect qualitative data.

Fortunately, researchers now have access to digital recording tools, and even speech-to-text technology that can automatically transcribe interview data using AI and machine learning. This type of tool can also be used to capture qualitative data from other qualitative research methods (focus groups, etc.), making this kind of social research or market research much less time-consuming.
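
As a rough sketch of what this looks like in practice (assuming the open-source openai-whisper Python package; the audio file name is invented), automated transcription can be a few lines of code:

    # pip install openai-whisper
    import whisper

    # Load a pretrained speech-to-text model ("base" trades some accuracy for speed)
    model = whisper.load_model("base")

    # Transcribe a recorded interview (the file name here is invented)
    result = model.transcribe("interview_participant_01.mp3")

    # Full transcript as plain text
    print(result["text"])

    # Time-stamped segments, useful for linking quotes back to the recording
    for segment in result["segments"]:
        print(f"{segment['start']:.1f}s-{segment['end']:.1f}s: {segment['text']}")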


Data analysis

Qualitative interview data is unstructured, rich in content and difficult to analyze without the appropriate tools. Fortunately, machine learning and AI can once again make things faster and easier when you use qualitative methods like the research interview.

Text analysis tools and natural language processing software can ‘read’ your transcripts and voice data and identify patterns and trends across large volumes of text or speech. They can also perform sentiment analysis (https://www.qualtrics.com/experience-management/research/sentiment-analysis/), which assesses overall trends in opinion and provides an unbiased overall summary of how participants are feeling.


Another feature of text analysis tools is their ability to categorize information by topic, sorting it into groupings that help you organize your data according to the topic discussed.
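
As an illustration, here is a minimal, hypothetical Python sketch of both tasks using the open-source Hugging Face transformers library; the model choice, topic labels, and example excerpts are assumptions for demonstration, not a description of any particular commercial tool:

    # pip install transformers torch
    from transformers import pipeline

    excerpts = [
        "Moving to the East Coast was the best decision we ever made.",
        "The commute wears me down and I barely see my family anymore.",
    ]

    # Sentiment analysis: the overall positive/negative leaning of each excerpt
    sentiment = pipeline("sentiment-analysis")
    for text, score in zip(excerpts, sentiment(excerpts)):
        print(text, "->", score["label"], round(score["score"], 2))

    # Topic categorization via zero-shot classification: sort excerpts into
    # researcher-defined topic groupings without training a custom model
    topics = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
    for text in excerpts:
        result = topics(text, candidate_labels=["relocation", "work-life balance", "family"])
        print(text, "->", result["labels"][0])  # labels are sorted best-first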

All in all, interviews are a valuable technique for qualitative research in business, yielding rich and detailed unstructured data. Historically, they have only been limited by the human capacity to interpret and communicate results and conclusions, which demands considerable time and skill.

When you combine this data with AI tools that can interpret it quickly and automatically, it becomes easy to analyze and structure, dovetailing perfectly with your other business data. An additional benefit of natural language analysis tools is that they are free of subjective biases, and can replicate the same approach across as much data as you choose. By combining human research skills with machine analysis, qualitative research methods such as interviews are more valuable than ever to your business.


Int J Environ Res Public Health

Conducting In-Depth Interviews via Mobile Phone with Persons with Common Mental Disorders and Multimorbidity: The Challenges and Advantages as Experienced by Participants and Researchers

Azadé Azad

1 Department of Psychology, Stockholm University, SE-106 91 Stockholm, Sweden; [email protected]

Elisabet Sernbo

2 Department of Social Work, University of Gothenburg, SE-405 30 Gothenburg, Sweden; [email protected]

Veronica Svärd

3 Department of Social Work, Södertörn University, SE-141 89 Huddinge, Sweden

4 Division of Insurance Medicine, Department of Clinical Neuroscience, Karolinska Institutet, SE-171 77 Stockholm, Sweden

Lisa Holmlund

5 Unit of Intervention and Implementation Research for Worker Health, Institute of Environmental Medicine, Karolinska Institutet, SE-171 77 Stockholm, Sweden; [email protected] (L.H.); [email protected] (E.B.B.)

Elisabeth Björk Brämberg

Associated Data

The data presented in this study are available on request from the authors V.S. and E.B.B. The data are not publicly available due to ethical restrictions.

Abstract

Qualitative interviews are generally conducted in person. As the coronavirus (COVID-19) pandemic prevents in-person interviews, methodological studies which investigate the use of the telephone with persons with different illness experiences are needed. The aim was to explore experiences of the use of the telephone for semi-structured research interviews, from the perspective of participants and researchers. Data were collected from mobile phone interviews with 32 individuals who had common mental disorders or multimorbidity, which were analyzed thematically, together with field notes reflecting the researchers’ experiences. The findings reveal several advantages of conducting interviews using mobile phones: flexibility, balanced anonymity and power relations, as well as a positive effect on self-disclosure and emotional display (leading to less emotional work and social responsibility). Challenges included the loss of the human encounter, intense listening, and worries about technology, as well as sounds or disturbances in the environment. However, the positive aspects of not seeing each other were regarded as more important. In addition, we present some strategies for before, during, and after conducting telephone interviews. Telephone interviews can be a valuable first option for data collection, allowing more individuals to be given a fair opportunity to share their experiences.

1. Introduction

In-depth interviews are one of the most common forms of data gathering in qualitative research [ 1 , 2 ]. The purpose is to obtain information about how individuals view, understand, and make sense of their lives, and how they assign meaning to particular experiences, events, and subjects [ 3 ]. Hence, such interviews are appropriate for exploring phenomena about which we have limited knowledge, or in generating knowledge to inform social or healthcare interventions [ 4 , 5 , 6 , 7 , 8 ].

Qualitative interviews have traditionally been conducted in-person, either individually or in focus groups [ 3 , 5 ]. There seems to be a consensus in the literature that in-person interviews are the best (‘gold standard’) format [ 9 ]. However, they are not always possible due to logistical, practical, or safety reasons, such as the COVID-19 pandemic [ 10 , 11 , 12 ]. The COVID-19 pandemic has produced a wide range of changes in customary practices of conducting research, particularly in the gathering of data [ 13 ]. Researchers, ourselves included, have been forced to use remote methods, such as telephone interviews, as a means of collecting qualitative data. Although proven to be a viable way of data collection [ 14 ], there is still a lack of methodological discussion about the use of telephone interviews for certain groups of participants [ 15 ], such as persons with common mental disorders (CMDs) (i.e., depression, anxiety, adjustment disorders) or multimorbidity. These groups, with symptoms such as exhaustion and bodily aches, have been difficult to recruit to research studies, due to mental distress, medications, stigma, and a reduced capacity to take on new information and thus to consent to participation, for example [ 16 , 17 ]. Telephone interviews might be a well-suited solution for these groups [ 18 ]; however, there is a lack of studies investigating the experiences of telephone interviews from the perspective of people with CMDs and multimorbidity.

Telephone Interview as a Method of Collecting Qualitative Data

Previously, telephone interviews have been used as a last resort for collecting qualitative research data [ 3 , 19 , 20 ]. The most common concerns about telephone interviews are that they might have a negative impact on the richness and quality of the collected information [ 19 ], the challenges in establishing rapport [ 21 , 22 ], and the inability to respond to visual and emotional cues [ 15 ]. Other criticisms involve the increased risk of misunderstandings and the inability to know if and when to ask probing questions or introduce more sensitive topics [ 20 ]. However, a growing body of literature using the telephone as a way of collecting data, as well as studies comparing the use of telephone with in-person interviews, do not find support for the traditionalist view. Rather, scholars make the case for the potential of in-depth telephone interviews as a viable and equivalent option for qualitative research [ 23 ], with some even arguing that they are, in some regards, methodologically superior to in-person interviews [ 24 , 25 ].

Available studies have, for example, shown that telephone interviews generate the same amount of data richness as in-person interviews in terms of word count and topic-related information [ 26 , 27 ], and only modest differences in depth of data [ 28 ], even though telephone interviews tend to be shorter [ 29 ]. One study [ 14 ] found that in-person interviews are more conversational and detailed than remote methods (telephone and Skype), but that they do not clearly lead to differences in interview ratings. Other scholars [ 30 ] state that the telephone offers flexibility regarding when and where to conduct the interview [ 24 ], which increases anonymity and reduces distraction for interviewees, thus improving the information given [ 26 , 31 ]. Several attempts to develop tools improving the success of in-depth telephone interviewing have been made [ 32 , 33 , 34 , 35 ], considering the criticisms raised against telephone interviews, as well as the counter-arguments. These tools provide a set of comprehensive approaches to follow before, during, and after the interview to ensure effective use. They emphasize the significance of communicating the importance of participant contribution, explaining the purpose of the study in the early phase of the research, either in writing or in an initial telephone contact, and establishing rapport through small talk when first contacting the participant [ 32 ]. Because of the absence of non-verbal cues and the difficulty of identifying visual emotional expressions, the importance of providing verbal feedback and follow-up probes is stressed [ 36 ], as well as using vocalizations and clarification to show responsiveness [ 32 ]. Such verbal cues or probing questions can in turn result in both parties listening more carefully [ 30 ].

Studies investigating the use of telephone interviews from the perspective of the interviewee have mostly yielded positive results. For many, telephone interviews are the preferred choice when given the option [ 25 ], for reasons of convenience and greater anonymity [ 35 , 37 ]. In contrast to traditionalist views, some researchers have found that interviewees find it easy to establish rapport [ 23 ]. Hence, some authors claim that telephone interviewing is suitable for vulnerable and marginalized populations and more sensitive questions [ 32 , 35 ].

Telephone interviews can also have advantages for the interviewer, by reducing self-consciousness [ 24 ] and bias and stereotyping about the interviewer. It can also benefit the researcher–participant relationship by providing a more balanced power dynamic between the two [ 27 ].

One group of participants who, despite the growing body of literature examining the advantages and challenges of telephone interviews, have not been further investigated, are people with experience of sick leave due to illness, such as CMDs and/or multimorbidity. It has been argued that there are specific challenges in interviewing people with mental illnesses and barriers having to do with the consequences of their symptoms (such as mental distress, medications, stigma, reduced ability to take in new information, and passive interaction with healthcare professionals) [ 16 , 17 , 38 ]. Research has also shown that recent illness or present ill health affects research participation negatively, and using telephone interviews has been suggested as a way of enhancing response rates [ 18 ]. Including the experiences of people who are or have been on sick leave due to CMDs or multimorbidity in research is critical, due to, for example, the individual and societal burden. However, in doing so, the interview situation must be adapted to suit the participants’ needs. This may be achieved by conducting telephone interviews.

The aim of the present study is, therefore, to explore the use of the telephone for semi-structured interviews from the perspective of these individuals. A further aim is to address the challenges and advantages of using the telephone from the perspective of the interviewer. To the best of our knowledge, there are no previous methodological studies into the use of telephone interviews with individuals with CMD or multimorbidity. Our study is, therefore, a unique contribution to the scarce research available on this topic.

2. Materials and Methods

2.1. Study Design

This study used a qualitative approach involving semi-structured interviews with people with CMD or multimorbidity who were on ongoing sick leave, or who had returned to work after sick leave. The interviews reflect the participants’ unique experiences regarding the use of the mobile phone when collecting data. The participants were included in two different projects (see Table 1 ). In these projects, in-person interviews were changed to telephone interviews because of the COVID-19 pandemic. This study focuses on the last part of the interviews, where probes were added to take into account the participants’ experience of being interviewed by mobile phone. We primarily refer to mobile phones, as mobile phone ownership is generally, and in Sweden in particular, much higher than landline ownership [ 39 ]. Both participants and researchers used mobile phones during the interviews.

Table 1. Information about the overall aim of the respective project and study, recruitment, and procedure.

RECO = The rehabilitation coordinator project; PROSA = A problem-solving intervention in primary health care aimed at reducing sick leave among people suffering from common mental disorders – a cluster-randomized trial; SA = sickness absence; CMD = common mental disorders; RTW = return to work; I, II, III = the three different studies from which data were collected.

2.2. Participants

Participants were recruited from two projects: the RECO-project [ 40 , 41 ] and the PROSA-project [ 42 ] (see Table 1 ). All participants were given written and/or oral information by post about the study, including that participation was voluntary. In the RECO-project, 70 individuals received written information, of whom 13 replied that they were interested in participating. One person later declined to participate because their knowledge of the investigated subject in the particular project was limited. In one of the PROSA-projects, 49 individuals were given oral information about the study. Of those, 18 received written information and agreed to be contacted by the researcher. Of these, 10 took part in an interview. In the other study linked to the PROSA-project, 15 participants were contacted by telephone by the researcher for information. Of these, three did not answer, one did not fit eligibility criteria, one declined to participate, and 10 were included in the present study.

In total, 32 participants were included in this study. Twelve participants were on sick leave due to multimorbidity, and twenty were on sick leave or had recently returned to work after sick leave due to CMDs. The participants varied in age (from 22 to 62), gender (7 men and 25 women, so a majority were women), and type of employment. For more detailed information about the participants, see Table 2 .

Table 2. Sociodemographic characteristics of the participants (n = 32).

2.3. Data Collection

Data were gathered through semi-structured mobile phone interviews with the participants and field notes kept by the researchers. The interviews were conducted between March and September 2020. The interviews followed interview guides with primary questions specific to each project, and follow-up probes about being interviewed by telephone. Only the data relating to telephone interviewing are included in the present study. The probes addressed the participants’ experience of the conducted telephone interviews, including the challenges and advantages of being interviewed over the telephone. The participants were also asked to reflect on possible alternative modes of interview (such as in-person or internet-based methods). Their reflections are not to be understood as direct comparisons between different research methodologies, as they only took part in telephone interviews, not internet-based or in-person interviews. Rather, the participants’ experiences are to be understood as unique reflections on being interviewed using mobile phones. During the interviews, the participants reflected on experiences of meeting professionals in-person and/or working with different technologies.

Interviews ranged in length from about 30 to 90 min for the whole interview. Three members of the research team (the first, second and fourth authors) conducted the interviews. All members of the research team were experienced in conducting in-depth in-person interviews, and some also had previous experience of conducting telephone interviews. Interviews were digitally recorded and transcribed verbatim in Swedish. The transcripts and digital recordings were cross-checked.

The data also consist of field notes [ 43 ] with reflections upon our experience as researchers conducting in-depth in-person and telephone interviews as a means of data collection. The field notes were written down directly after every phone call. Each interviewer noted their immediate recollection of the conversation, summarizing how they experienced the interview format and content as well as their reflections about the interview generally.

2.4. Data Analysis

Thematic analysis [ 44 , 45 ] was conducted to explore participants’ views of participating in qualitative interviews by telephone. We began our analysis by reading through the transcribed text to familiarize ourselves with the material and search for patterns in the data. We then identified important and interesting features, focusing on the semantic and latent meanings in line with the aim. These features included words, sentences, or paragraphs relating to what the participants found difficult or easy about being interviewed over the telephone, and were then condensed and assigned a code. The third step involved searching for possible themes, by identifying and coding them across participants. This step was performed on the first 22 interviews collected and refocused the analysis at the broader level of themes, rather than codes; it involved sorting the codes into potential themes and collating all the relevant coded data extracts within these themes. The first and second authors made a first draft of the themes and the remaining researchers read through and discussed them. This discussion involved reviewing and refining themes, both with regard to each theme in itself and in relation to the data set. The ten remaining transcripts were analyzed based on the drafted themes and used to check for depth in the analysis. No new themes were added, and the initial themes were adjusted until the conceptual depth of the themes was agreed upon [ 46 ]. A final step involved rechecking the data for any codes that may have been missed, before refining and defining the essence of each theme by naming it. During the analysis process, the coding and themes were repeatedly discussed by all the researchers until consensus was reached. The first author then translated the themes and quotes from Swedish into English, the second and fourth authors reviewed the translations, and all the authors made a final revision.
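
As a toy illustration only, and not the authors’ actual workflow, the collating step described above can be pictured in code as mapping codes to candidate themes and grouping the coded extracts under them (the code labels below are hypothetical; the theme name and quotes come from this paper):

    # Toy sketch of collating coded extracts into candidate themes.
    # Code labels are hypothetical; quotes and theme come from this paper.
    from collections import defaultdict

    coded_extracts = [
        ("saving energy", "It's also nice to be at home and not have to go to an interview."),
        ("technical worry", "I would also have worried about the [internet-based] technology."),
    ]

    # Candidate theme assigned to each code during analysis
    theme_of_code = {
        "saving energy": "Flexibility of location",
        "technical worry": "Flexibility of location",
    }

    # Collate all relevant coded extracts within each theme
    themes = defaultdict(list)
    for code, quote in coded_extracts:
        themes[theme_of_code[code]].append((code, quote))

    for theme, extracts in themes.items():
        print(theme)
        for code, quote in extracts:
            print(f"  [{code}] {quote}")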

The field notes are understood as condensed rather than transcribed, and were jointly discussed and elaborated, inspired by notions on how the written record and memory interact [ 47 ]. Our reflections based on these field notes are analyzed and presented separately from the analysis of the participants’ narratives. This analysis was inspired by thematic analysis, although not following Braun and Clarke’s [ 44 , 45 ] six steps.

3. Results and Discussion

The findings are presented in three themes, including discussion in relation to relevant research: flexibility of location, personal well-being and emotional ease, and balancing anonymity and social responsibility. The themes reflect patterns of meaning relating to the experiences of being interviewed over the mobile phone. They are not hierarchical in relation to one another but rather presuppose each other; one enables the other while being on the same analytical level. After presenting the three themes, the researchers’ experiences and reflections are offered and discussed in relation to the themes.

3.1. Flexibility of Location

The first theme had to do with practical and environmental aspects, such as the flexibility to choose the place and surroundings during the mobile phone interview, compared to landline phone or in-person options. The flexibility of using mobile phones meant that the participants were free to choose the place for the interview, and did not have to physically meet the interviewer. Most participants conducted the interviews from home, and a few from their workplace—geographically close and familiar environments. Not having to spend time or energy travelling was of great importance for the majority of the participants. The time saved by a telephone interview compared to an in-person one was, for some participants, crucial for participation. For example, one participant said:

It’s also nice to be at home and not have to go to an interview and so on, because that would use so much energy. Then maybe I would choose not to do it. (Female, 38 years, multimorbidity)

Although these benefits—for both participants and researchers—have been identified in previous research [ 24 , 26 ], our results point to the importance of flexibility, regarding both geography and time, for this group of participants specifically. As their mental and/or physical health makes it difficult for them to travel, telephone interviews offer a way of participating without having to do so.

Flexibility was also associated with the mobile phone specifically, rather than other technologies, for example internet-based options such as Skype or Zoom. While some thought that internet-based video options were desirable because of the ability to see each other, the vast majority preferred the mobile phone option. As one participant said:

I would also have worried about the [internet-based] technology, I have to say, it’s probably inevitable that you do to some degree. (Female, 34 years, stress syndrome)

Using the mobile phone, however, added no extra technical demands for the participant and, therefore, meant limited technical worries before and/or during the interview. Some used internet-based technology at work, but others had no experience of such tools and said they would have been worried about coping with the technology. This is in line with Seitz’s [ 48 ] reasoning that technical difficulties may have a negative impact on the interview. For our participants, in contrast to what Sipes et al. [ 49 ] have proposed, voice-only options are not always an equal alternative to using mobile phones.

3.2. Personal Well-Being and Emotional Ease

In personal and emotional terms, using the mobile phone rather than an in-person interview was seen as helping the participants’ well-being and emotional ease. Suffering from CMD and/or multimorbidity was already perceived as demanding by the participants. In comparison to an in-person interview or internet-based video options, the mobile phone interview not only enabled them to choose the place and surroundings for the interview, but also their position and the ability to move around while talking. Some participants appreciated the ability to conduct the interview via mobile phone while taking a walk outside, which would not have been possible using a landline phone. Being physically comfortable and free was highly valued, given that the participants had symptoms of CMDs and/or multimorbidity with depression, exhaustion, and bodily aches. In line with Cachia and Millward’s findings [ 24 ], our participants reported being less self-conscious when not having to think about how to sit or conform to social cues and norms as in an in-person or video-based meeting.

Being able to do the interview over the telephone caused less anxiety and was less emotionally demanding. This is described by one of the participants:

There’s a lot of fear and stress, and talking about these things can make it, since it’s so personal, I get scared of being judged and looking someone in the eye, seeing them react in a negative way about something that has… You can’t see that on the phone. (Female, 50 years, multimorbidity)

Other emotional advantages had to do with feeling less inhibited when not being able to see each other. For some, this meant being able to talk more freely; for others, it meant displaying more emotions such as crying. For example, one participant said that it was easier to continue talking even though she had been crying, because the interviewer may not even have noticed. The telephone was experienced as providing a positive sense of protection when sharing. As one participant put it:

When you get an anxiety attack, or, I don’t know how to put it, but like, you feel kind of protected behind the phone. (Male, 25 years, depression)

In this regard, conducting the interviews over the telephone meant that fewer emotions were visible, so it was easier to cry than when meeting someone in person. For some participants, the reduced emotion work demanded by telephone interviews was a precondition for participation. These findings reinforce those of previous studies [ 37 , 50 ], showing that some participants regarded the telephone interview as the ‘only option’ that enabled them to participate at all. This suggests that telephone interviews can increase participation and, thus, the heterogeneity and breadth of the data. In particular, it seems to be crucial for involving some of the most vulnerable groups, i.e., those with limited energy and ability to participate in an in-person interview due to mental or physical illness. As such groups have been described as hard to recruit for research studies [ 16 ], our results suggest that telephone interviews might help overcome the challenges of interviewing people with, for example, CMDs and/or multimorbidity. Using the telephone can simply be considered an easier way to participate in research interviews, placing fewer demands on the participant compared to video options or face-to-face interviews.

These findings also relate to how telephone interviews reduce participants’ emotion work, in accordance with Hochschild [ 51 ], because participants do not visibly convey and manage their feelings in the social interaction. Goffman [ 52 ] argues that people strive to convey their feelings in a socially acceptable way and manage their emotional expressions and impressions. By removing the visible dimensions of social interaction, and giving participants the opportunity to be ‘protected behind the phone’, the emotion work is not completely removed from the interaction, but the conditions are changed, because participants can maintain the desired anonymity and emotional distance. The telephone interview, compared with in-person interviews, allows interviewees to shed an unseen tear, lie down without anybody knowing, and keep visible emotions private. The freedom offered by these choices, together with the flexibility and time- and energy-saving aspects discussed earlier, suggests that telephone interviews allow participants to share their experiences while putting less strain on them as they do so.

3.3. Balancing Anonymity and Social Responsibility

The third theme focuses more explicitly on the relational aspects of the mobile phone interview. The physical distance, with the participant and interviewer unable to see each other, not only made it easier to protect one’s emotional expressions, but also created a sense of anonymity, making it easier to talk about sensitive subjects. As one of the participants put it:

It gets very personal, these are very personal things to talk about … and I don’t know you. So then it can be nice to have this little bit of distance. (Male, 46 years, depression)

The sense of freedom related to the ability to choose the level of intimacy in the interview, unique to the telephone mode, thus contributed to a sense of anonymity and psychological distance. This also made it more likely that interviewees would feel comfortable talking about sensitive subjects [ 25 , 37 ]. The perceived higher degree of anonymity might result in richer data and higher validity among responses, as the telephone mode could decrease social desirability. For example, avoiding being seen by an in-person or video-based interviewer can create a feeling of being less judged and not being in the gaze of the professional [ 25 ]. Telephone interviews can thus lead to a more balanced power dynamic between the participant and the interviewer [ 27 ]. The feeling of distance was also described as making it easier to take control and end a conversation which may not have felt good or right.

Telephone interviews required less social responsibility, since participants were able to focus solely on what the other person was saying instead of thinking about social cues and norms as in an in-person meeting (such as where to look, how to sit, when to nod or smile, and so on). Goffman [ 52 ] uses the term impression management to discuss how people put on performances during in-person social interactions in order to manage, rather than show, their feelings. Our findings suggest that the telephone interview may ease the burden on the participants to put on a performance, as they do not have to think about their body language, or relate to social cues or norms to the same extent as in an in-person interview.

The downside of this form of interview was the intense listening it required, which some participants described as somewhat demanding. Receiving fewer cues via visual interaction is, thus, described as a balancing act, as some participants stressed the importance of the interviewer keeping the conversation on track, not leaving them unsure about whether or not they were talking about the ‘right’ things. They mentioned the importance of the interviewer’s voice, both in relation to being able to understand the other and being understood. For example, participants described finding it easy to ‘get a feeling’ for the other person through the tone of voice instead of through the other social cues used when sitting face to face. As one of the participants explained, the way the interviewer spoke, referring to the tone of voice, helped to instill confidence. Although verbalization has been stressed in telephone interviews [ 32 ], our finding adds to this research by stressing the importance of not only what is being said but also how it is said. The tone and quality of the interviewer’s voice are, thus, crucial tools in in-depth telephone interviewing.

When talking about the negative aspects of telephone interviews, the participants also mentioned several factors relating to the first contact and impression of an in-person meeting. For example, they mentioned that it is interesting and fun to meet new people and that it is nice to see the other person. This was often linked to curiosity and ‘the human encounter’. Negative aspects of not being able to see each other were also described as affecting interactions:

Not that I find it difficult, but if you’re sitting together, in a way you have another kind of interplay because you can see one another. (Female, 46 years, multimorbidity)

However, because they viewed this interview as a one-off and were not going to have a further relationship with the person interviewing them, the positive aspects of not seeing each other were regarded as more important. As they explained, they were first and foremost interested in conveying their experiences. Some also reported that they were able to create their own image of the interviewer, which filled the same function as an in-person meeting.

3.4. Researchers’ Experiences and Reflections

The analysis of the researchers’ experiences and field notes resulted in two themes, worries and challenges about the technology and relational and social aspects, as well as a third overarching theme of understanding the telephone as a ‘shield’. Quotations from our field notes are provided for each theme in order to illustrate and contextualize the results. Regarding the first theme, worries and challenges about the technology, the researchers reflected that the mobile phone interview was sometimes imbued with worries and challenges about the technology used, for example not being able to control the quality of participants’ network coverage or mobile equipment. Using a mobile phone can, therefore, involve more technological challenges than using a landline phone. Moreover, the participants’ choice of environment in some cases meant disturbances that challenged the researchers’ sense of being able to control the interview. The possible negative impact on the interview if, for example, the interviewee’s network coverage was insufficient, or if there were disturbances in the physical or social environment, is illustrated by this reflection:

The first time I call him, he is in his car, and we agree that I can call again in 15 min, when he has arrived home. At the beginning of the interview, it is somewhat difficult because he has not found a friend for his son to play with [as he had hoped] and he is a bit hesitant related to what he can do to occupy his son. I offer to reschedule, but he wants to do the interview and starts a movie [that his son can watch during the interview]. (Written by L.H. The quote refers to a male participant, 45 years, stress syndrome)

The participants being in a situation where they can decide whether they want to wash their dishes or take a stroll while talking on their mobile phone can leave the researcher experiencing a loss of power over the situation. This disadvantage for the researcher can be an advantage for the participant, showing that using the telephone for interviewing involves giving away some power over the situation to the interviewee. Whale [ 50 ] points to the loss of power for the researcher, in interviewing over Skype or telephone, as something that enables a more balanced power dynamic between the interviewer and the interviewee. Our findings show that using a mobile phone further expands the freedom of the participants, and inevitably means a redistribution of power from the researcher to the participants. At the same time, the interviewer controls most elements of the interview, such as the topics discussed [ 53 ]. The redistribution of power can, therefore, be both welcome and challenging.

Regarding the theme relational and social aspects, we also reflected on how the participants’ sense of emotional ease contrasted with the researchers’ feelings of being less able to recognize and respond to the participants’ emotions and states of mind. A lack of visible feedback meant a need to use the voice and the language more consciously to convey understanding and show interest in the participant’s unique experiences.

I can hear that she is sad. I tell her this and say something confirmatory. I emphasize that it is ok to take a break if she wants to. (Written by E.S. The quote refers to a female participant, 38 years, stress syndrome)

In an in-person interview or video-based option, it is possible to non-verbally assure participants that their stories are ‘on track’, or show sympathy and understanding, in order to not disrupt them. In a telephone interview, however, the nod of the head must be made audible, all the while avoiding interrupting the interviewee. For the interviewer, this involves a clear shift from the non-verbal feedback style to the audible.

She is crying, which she had hinted might happen the first time that we talked. I tell her that we can take a break or end the interview if needed. Not seeing the other person makes it more difficult for me to decide whether to continue or not. I must trust her. It is apparent that the verbal response becomes more important when someone is showing emotion. (Written by A.A. The quote refers to a female participant, 35 years, multimorbidity)

An advantage, however, was that the format of the telephone interview seemed to enrich the participants’ stories. For example, the participants themselves conveyed that being behind the telephone acted as a ‘shield’, which, in a sense, allowed them to express themselves more easily, and we reflected on the openness and detail in the participants’ stories. For some, the possibility to choose their level of emotional closeness or distance meant that they were more comfortable talking about sensitive subjects.

I am surprised to see that their stories have a flow to them, that they have shared openly. They also reflect on this themselves, that the anonymity allows an openness. (Written by L.H.)

4. Reflections and Strategies for Conducting Telephone Interviews—Before, During, and After

The results point to the value of telephone interviews in decreasing the emotional demands placed on the participants, highlighting the importance of anonymity and social responsibility, and providing the participants with the freedom to choose the level of intimacy, while enabling them to contribute to research despite dealing with symptoms. Although the ongoing COVID-19 pandemic obliged us, as researchers, to conduct interviews by phone, some participants regarded the mobile phone option as a crucial factor which enabled them to participate in a research interview at all. These results are important to address in future studies, because the participants, often struggling with symptoms such as pain, exhaustion, or anxiety, had to spend less energy on paying attention to social cues and norms, and could instead focus on how to convey their personal experiences.

More information about the informal insights derived from qualitative interviews as a means of data collection has been called for [ 33 ]. Our findings highlight challenges, advantages, and possible strategies which can be useful (1) when preparing the interview, (2) during the interview, and (3) after the interview. These strategies are relevant for all telephone interviews, and some are particularly important for the study group, i.e., participants struggling with symptoms such as pain, exhaustion, or anxiety.

When preparing the interview, our findings indicate the importance of a first introductory call to familiarize the interviewer and interviewee with each other and discuss how the interview will be carried out. This entails telling participants that they should preferably be able to talk freely without distraction, and that silence during the interview should be interpreted as active listening from an interviewer who does not want to disturb their stories. This introductory conversation prepares the participant for the particular form of dialog that a telephone interview is, but it also serves to establish rapport. In other words, it is a way of ‘getting to know each other’ without seeing each other, as well as clarifying the use of the voice and of silences. This is important for building trust between the interviewee and the interviewer, in line with the recommendation of, for example, Drabble et al. [ 32 ]. We also found that it was important for the interviewer to convey to the participant his or her understanding of the circumstances that are central to the subject of the interview—in our case, their health status and work disability. This suggests that the potential of the method is related to the interviewer being sufficiently familiar with the research topic and the specific kind of difficulties the participants are facing. This can be an important factor for validating the participant during the interview and building a trusting relationship over the telephone, as we were not able to do so using visual cues. As all participants used mobile phones, we found it necessary to encourage participants to choose a place with good reception and minimal background noise, which is especially important when using a mobile phone compared to a landline. This can prevent problems arising during the telephone interview and allay the researcher’s own worries beforehand. The researcher too must choose a space with good reception and check that the recording equipment is working properly.

During the interviews, we found that verbalization was important for communicating the reason for silences (e.g., taking notes or giving the participant time to continue talking). Communicating responses was also important (e.g., saying ‘please continue’ or ‘do you need to take a break’, or giving short summaries of what had been said). In addition, the tone of voice was found to be another important tool for conveying interest and understanding, as well as establishing confidence. Further, we found that asking participants about their experience of being interviewed over the telephone was a good way of ending the interview, which was primarily about their experiences of being on sick leave. This smoothly closed the main story, allowed the participants to be brought back to the present, and gave them the power of being experts in their own experience of the interview situation.

After the interviews, we found it important to gather our own reflections and experience of the interview by writing summaries of our overall impressions and making field notes about our experience of the interviewing situation as well as the main findings in relation to the questions asked. These field notes were valuable tools for evaluating or supplementing the data and they were used as data for the researcher’s reflections in the findings [ 43 ]. As we did not have to spend any time traveling to or from the interviews, we were able to carry out this post-interview part of the procedure more effectively, directly after the interview. Completing the interviews from home or the workplace for us as researchers also meant that we could secure the data in an effective way, i.e., we could save the recording in a secure manner immediately after the interview was over.

5. Methodological Considerations

There is a lack of methodological studies investigating the use of telephone interviews with individuals with CMD and/or multimorbidity; this study contributes to filling that gap in the literature. The strategic sampling of participants, with a diversity of demographic characteristics and viewpoints, facilitates the provision of a rich data set [ 54 ]. Yet, the transferability of findings from qualitative studies to other groups or settings may be limited. To allow for judgment of transferability to other groups or settings, the authors strived to provide detailed descriptions of the study design and clear communication of the findings. Although some of the findings are specifically related to the participants’ symptoms from their CMDs and/or multimorbidity, they may also be transferable to other groups which may not have a diagnosis but do experience the same type of symptoms or difficulties.

A limitation of the study is that the participants in general did not have experience of in-person or internet-based research interviews, and that we did not have a comparison group who conducted the interviews in person or via an internet-based option. However, as our purpose was not to compare the different formats but rather to gather knowledge on the experience of telephone interviews from the perspective of participants, such a comparison was beyond our scope. One might also want to consider how the presence of a third person during the interviews could have constrained the participants’ responses; however, we do not have information about the presence of other people, besides children being present during the interviews. Furthermore, in cases where the participants were in public, we rescheduled interviews to a better suited time and setting.

6. Conclusions

To conclude, telephone interviews are a method with both advantages and challenges. They provide more anonymity, which seems to have a positive effect on self-disclosure and emotional display, while making fewer demands of participants in terms of emotion work and social responsibility. However, the shift from the non-verbal to the audible puts higher demands on the use of the voice and requires more intense listening on both sides. Worries about the quality of the interview due to difficulties with technology, and sounds or disturbances in the environment, are also challenges, as is the loss of the human encounter. Using telephone interviews as a means of qualitative data collection balances the power relationship between the interviewer and the interviewee, which can be demanding for the interviewer but beneficial for those being interviewed. The advantages, which were deemed more important than the challenges, may give a certain group of individuals (e.g., those with CMDs or multimorbidity) a fairer opportunity to participate in research projects and share their experiences. Telephone interviews can be regarded as a valuable first option if the purpose of the study is not to build a relationship over time or observe visual cues, but rather to explore how people experience their lives.

Acknowledgments

We are very grateful to the participants for sharing their stories with us.

Author Contributions

Conceptualization, A.A., E.S., V.S., L.H. and E.B.B.; methodology, A.A., E.S., V.S. and E.B.B.; validation, A.A., E.S., V.S., L.H. and E.B.B.; formal analysis, A.A., E.S., V.S., L.H. and E.B.B.; investigation, A.A., E.S., V.S., L.H. and E.B.B.; resources, V.S., L.H. and E.B.B.; data curation, A.A., E.S., V.S., L.H. and E.B.B.; writing—original draft preparation, A.A.; writing—review and editing, A.A., E.S., V.S., L.H. and E.B.B.; supervision, V.S. and E.B.B.; project administration, V.S. and E.B.B.; funding acquisition, V.S., L.H. and E.B.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was financially supported by grants from the Swedish Research Council for Health, Working Life and Welfare (FORTE) (Grant No. 2018-01252), AFA-Insurance (Dnr. 199221), and the Kamprad Family Foundation (Reference No. 20190271).

Institutional Review Board Statement

All procedures performed in the present study were in accordance with the ethical standards of the Swedish Ethical Review Authority and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The projects included in the present study were approved by the Swedish Ethical Review Authority (No 2020-00403; 2020-02462; 496-17, amendment T039-18).

Informed Consent Statement

Written informed consent was obtained from all subjects involved in the study. The written consent included publication of anonymized responses.

Data Availability Statement

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.


How to conduct qualitative interviews (tips and best practices)


Conducting qualitative interviews can be challenging, even for seasoned researchers. Poorly conducted interviews can lead to inaccurate or incomplete data, significantly compromising the validity and reliability of your research findings.

When planning to conduct qualitative interviews, you must adequately prepare yourself to get the most out of your data. Fortunately, there are specific tips and best practices that can help you conduct qualitative interviews effectively.

  • What is a qualitative interview?

A qualitative interview is a research technique used to gather in-depth information about people's experiences, attitudes, beliefs, and perceptions. Unlike a structured questionnaire or survey, a qualitative interview is a flexible, conversational approach that allows the interviewer to delve into the interviewee's responses and explore their insights and experiences.

In a qualitative interview, the researcher typically develops a set of open-ended questions that provide a framework for the conversation. However, the interviewer can also adapt to the interviewee's responses and ask follow-up questions to understand their experiences and views better.

  • How to conduct interviews in qualitative research

Conducting interviews involves a well-planned and deliberate process to collect accurate and valid data. 

Here’s a step-by-step guide on how to conduct interviews in qualitative research, broken down into three stages:

1. Before the interview

The first step in conducting a qualitative interview is determining your research question. This will help you identify the type of participants you need to recruit. Once you have your research question, you can start recruiting participants by identifying potential candidates and contacting them to gauge their interest in participating in the study.

After that, it's time to develop your interview questions. These should be open-ended questions that will elicit detailed responses from participants. You'll also need to get consent from the participants, ideally in writing, to ensure that they understand the purpose of the study and their rights as participants. Finally, choose a comfortable and private location to conduct the interview and prepare the interview guide.

2. During the interview

Start by introducing yourself and explaining the purpose of the study. Establish a rapport by putting the participants at ease and making them feel comfortable. Use the interview guide to ask the questions, but be flexible and ask follow-up questions to gain more insight into the participants' responses. 

Take notes during the interview, and ask permission to record the interview for transcription purposes. Be mindful of the time, and cover all the questions in the interview guide.

3. After the interview

Once the interview is over, transcribe the interview if you recorded it. If you took notes, review and organize them to make sure you capture all the important information. Then, analyze the data you collected by identifying common themes and patterns. Use the findings to answer your research question. 

Finally, debrief with the participants to thank them for their time, provide feedback on the study, and answer any questions they may have.


  • What kinds of questions should you ask in a qualitative interview?

Qualitative interviews involve asking questions that encourage participants to share their experiences, opinions, and perspectives on a particular topic. These questions are designed to elicit detailed and nuanced responses rather than simple yes or no answers.

Effective questions in a qualitative interview are generally open-ended and non-leading. They avoid presuppositions or assumptions about the participant's experience and allow them to share their views in their own words. 

In customer research, you might ask questions such as:

What motivated you to choose our product/service over our competitors?

How did you first learn about our product/service?

Can you walk me through your experience with our product/service?

What improvements or changes would you suggest for our product/service?

Have you recommended our product/service to others, and if so, why?

The key is to ask questions relevant to the research topic and allow participants to share their experiences meaningfully and informally. 

  • How to determine the right qualitative interview participants

Choosing the right participants for a qualitative interview is a crucial step in ensuring the success and validity of the research. You need to consider several factors when determining who those participants should be. These may include:

Relevant experiences: Participants should have experiences related to the research topic that can provide valuable insights.

Diversity: Aim to include diverse participants to ensure the study's findings are representative and inclusive.

Access: Identify participants who are accessible and willing to participate in the study.

Informed consent: Participants should be fully informed about the study's purpose, methods, and potential risks and benefits and be allowed to provide informed consent.

You can use various recruitment methods, such as posting ads in relevant forums, contacting community organizations or social media groups, or using purposive sampling to identify participants who meet specific criteria.

  • How to make qualitative interview subjects comfortable

Making participants comfortable during a qualitative interview is essential to obtain rich, detailed data. Participants are more likely to share their experiences openly when they feel at ease and not judged. 

Here are some ways to make interview subjects comfortable:

Explain the purpose of the study

Start the interview by explaining the research topic and its importance. The goal is to give participants a sense of what to expect.

Create a comfortable environment

Conduct the interview in a quiet, private space where the participant feels comfortable. Turn off any unnecessary electronics that can create distractions. Ensure your equipment works well ahead of time. Arrive at the interview on time. If you conduct a remote interview, turn on your camera and mute all notetakers and observers.

Build rapport

Greet the participant warmly and introduce yourself. Show interest in their responses and thank them for their time.

Use open-ended questions

Ask questions that encourage participants to elaborate on their thoughts and experiences.

Listen attentively

Resist the urge to multitask. Pay attention to the participant's responses, nod your head, or make supportive comments to show you’re interested in their answers. Avoid interrupting them.

Avoid judgment

Show respect and don't judge the participant's views or experiences. Allow the participant to speak freely without feeling judged or ridiculed.

Offer breaks

If needed, offer breaks during the interview, especially if the topic is sensitive or emotional.

Creating a comfortable environment and establishing rapport with the participant fosters an atmosphere of trust and encourages open communication. This helps participants feel at ease and willing to share their experiences.

  • How to analyze a qualitative interview

Analyzing a qualitative interview involves a systematic process of examining the data collected to identify patterns, themes, and meanings that emerge from the responses. 

Here are some steps on how to analyze a qualitative interview:

1. Transcription

The first step is transcribing the interview into text format to have a written record of the conversation. This step is essential to ensure that you can refer back to the interview data and identify the important aspects of the interview.

2. Data reduction

Once you’ve transcribed the interview, read through it to identify key themes, patterns, and phrases emerging from the data. This process involves reducing the data into more manageable pieces you can easily analyze.

3. Coding

The next step is to code the data by labeling sections of the text with descriptive words or phrases that reflect the data's content. Coding helps identify key themes and patterns in the interview data (a small worked sketch follows these steps).

4. Categorization

After coding, you should group the codes into categories based on their similarities. This process helps to identify overarching themes or sub-themes that emerge from the data.

5. Interpretation

You should then interpret the themes and sub-themes by identifying relationships, contradictions, and meanings that emerge from the data. Interpretation involves analyzing the themes in the context of the research question.

6. Comparison

The next step is comparing the data across participants or groups to identify similarities and differences. This step helps to ensure that the findings aren’t just specific to one participant but can be generalized to the wider population.

7. Triangulation

To ensure the findings are valid and reliable, you should use triangulation by comparing the findings with other sources, such as observations or interview data.

8. Synthesis

The final step is synthesizing the findings by summarizing the key themes and presenting them clearly and concisely. This step involves writing a report that presents the findings in a way that is easy to understand, using quotes and examples from the interview data to illustrate the themes.
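To make steps 3 and 4 concrete, here is a minimal sketch of keyword-based coding and categorization in Python. The codebook, categories, and transcript segments are hypothetical, and real qualitative coding is interpretive rather than purely keyword-driven; treat this as an illustration of the bookkeeping, not the judgment.

```python
# A minimal sketch of steps 3-4 (coding and categorization), assuming a
# hand-built codebook. Keywords, codes, and categories are hypothetical.
from collections import Counter

codebook = {
    "price": "cost concerns",
    "expensive": "cost concerns",
    "support": "service quality",
    "helpful": "service quality",
    "confusing": "usability",
}

categories = {
    "cost concerns": "barriers",
    "usability": "barriers",
    "service quality": "drivers",
}

def code_segment(segment: str) -> set[str]:
    """Return the set of codes whose keywords appear in a transcript segment."""
    text = segment.lower()
    return {code for keyword, code in codebook.items() if keyword in text}

transcript_segments = [
    "Honestly the support team was really helpful when I got stuck.",
    "It felt expensive compared to what I was used to.",
    "The menu layout was confusing at first.",
]

code_counts = Counter()
for segment in transcript_segments:
    code_counts.update(code_segment(segment))

# Roll codes up into the broader categories (step 4).
category_counts = Counter(categories[code] for code in code_counts.elements())
print(code_counts, category_counts)
```

Dedicated QDA software handles this bookkeeping at scale; the sketch only shows how coded segments roll up into categories for comparison across the data set.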

  • Tips for transcribing a qualitative interview

Transcribing a qualitative interview is a crucial step in the research process. It involves converting the audio or video recording of the interview into written text. 

Here are some tips for transcribing a qualitative interview:

Use transcription software

Transcription software can save time and increase accuracy by automatically transcribing audio or video recordings.

Listen carefully

When manually transcribing, listen carefully to the recording to ensure clarity. Pause and rewind the recording as necessary.

Use appropriate formatting

Use a consistent format for transcribing, such as marking pauses, overlaps, and interruptions. Indicate non-verbal cues such as laughter, sighs, or changes in tone.

Edit for clarity

Edit the transcription to ensure clarity and readability. Use standard grammar and punctuation, correct misspellings, and remove filler words like "um" and "ah."

Proofread and edit

Verify the accuracy of the transcription by listening to the recording again and reviewing the notes taken during the interview.

Use timestamps

Add timestamps to the transcription to reference specific interview sections.

Transcribing a qualitative interview can be time-consuming, but it’s essential to ensure the accuracy of the data collected. Following these tips can produce high-quality transcriptions useful for analysis and reporting.
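If you transcribe manually, parts of these tips can be partly automated. The sketch below uses only Python's standard library to strip common filler words and add timestamps; the filler list and the [HH:MM:SS] format are assumptions, not fixed conventions.

```python
# A sketch of the "edit for clarity" and "use timestamps" tips, using
# only the standard library. The filler-word list and the [HH:MM:SS]
# timestamp format are assumptions, not fixed conventions.
import re

FILLERS = re.compile(r"\b(um+|uh+|ah+|er+)\b,?\s*", flags=re.IGNORECASE)

def clean_utterance(text: str) -> str:
    """Strip filler words and collapse leftover spacing (capitalization
    and punctuation are left for a human editing pass)."""
    return re.sub(r"\s{2,}", " ", FILLERS.sub("", text)).strip()

def format_timestamp(seconds: float) -> str:
    """Render elapsed seconds as [HH:MM:SS] for the transcript margin."""
    h, rem = divmod(int(seconds), 3600)
    m, s = divmod(rem, 60)
    return f"[{h:02d}:{m:02d}:{s:02d}]"

# Hypothetical raw segments: (start time in seconds, speaker, utterance).
raw = [
    (0.0, "Interviewer", "So, um, can you walk me through your experience?"),
    (4.2, "Participant", "Uh, sure. It started when I signed up, um, about a year ago."),
]

for start, speaker, utterance in raw:
    print(f"{format_timestamp(start)} {speaker}: {clean_utterance(utterance)}")
```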

  • Why are interview techniques in qualitative research effective?

Unlike quantitative research methods, which rely on numerical data, qualitative research seeks to understand the richness and complexity of human experiences and perspectives. 

Interview techniques involve asking open-ended questions that allow participants to express their views and share their stories in their own words. This approach can help researchers to uncover unexpected or surprising insights that may not have been discovered through other research methods.

Interview techniques also allow researchers to establish rapport with participants, creating a comfortable and safe space for them to share their experiences. This can lead to a deeper level of trust and candor, leading to more honest and authentic responses.

  • What are the weaknesses of qualitative interviews?

Qualitative interviews are an excellent research approach when used properly, but they have their drawbacks. 

The weaknesses of qualitative interviews include the following:

Subjectivity and personal biases

Qualitative interviews rely on the researcher's interpretation of the interviewee's responses. The researcher's biases or preconceptions can affect how the questions are framed and how the responses are interpreted, which can influence results.

Small sample size

The sample size in qualitative interviews is often small, which can limit the generalizability of the results to the larger population.

Data quality

The quality of data collected during interviews can be affected by various factors, such as the interviewee's mood, the setting of the interview, and the interviewer's skills and experience.

Socially desirable responses

Interviewees may provide responses that they believe are socially acceptable rather than truthful or genuine.

Expensive

Conducting qualitative interviews can be expensive, especially if the researcher must travel to different locations to conduct the interviews.

Time-consuming

The data analysis process can be time-consuming and labor-intensive, as researchers need to transcribe and analyze the data manually.

Despite these weaknesses, qualitative interviews remain a valuable research tool. You can take steps to mitigate the impact of these weaknesses by incorporating the perspectives of other researchers or participants in the analysis process, using multiple data sources, and critically analyzing your biases and assumptions.

Mastering the art of qualitative interviews is an essential skill for businesses looking to gain deep insights into their customers' needs, preferences, and behaviors. By following the tips and best practices outlined in this article, you can conduct interviews that provide you with rich data that you can use to make informed decisions about your products, services, and marketing strategies.

Remember that effective communication, active listening, and proper analysis are critical components of successful qualitative interviews. By incorporating these practices into your customer research, you can gain a competitive edge and build stronger customer relationships.


Telephone Interviewing as a Qualitative Methodology for Researching Cyberinfrastructure and Virtual Organizations

  • Reference work entry
  • First Online: 10 October 2019

  • Kerk F. Kee & Andrew R. Schrock


Cyberinfrastructure (CI) involves networked technologies, organizational practices, and human workers that enable computationally intensive, data-driven, and multidisciplinary collaborations on large-scale scientific problems. CI enables emerging forms of mediated relationships, dispersed groups, virtual organizations, and distributed communities. Researchers of CI often employ a limited set of methodologies such as trace data analysis and ethnography. In response, this chapter proposes a more flexible framework of interviewing members of dispersed groups, virtual organizations, and distributed communities whose work, interaction, and communication are primarily mediated by communication technologies. Telephone interviewing can yield high-quality data under appropriate conditions, making it a productive mode of data collection comparable to a face-to-face mode. The protocol described in this chapter for telephone interviews has been refined over three studies (total N  = 236) and 10 years (2007–2017) of research. The protocol has been shown to be a flexible and effective way to collect qualitative data on practices, networks, projects, and biographical histories in the virtual CI communities under study. These benefits speak to a need in CI research to expand from case studies and sited ethnographies. Telephone interviewing is a valuable addition to the growing literature on CI methodologies. Furthermore, our framework can be used as a pedagogical tool for training students interested in qualitative research.



Acknowledgments

The authors thank Larry Browning for his support and early contribution to Study 1 documented in this chapter. Studies 2 and 3 were funded by NSF ACI 1322305 and NSF ACI 1453864, respectively.

Author information

Authors and Affiliations

School of Communication, Chapman University, Orange, CA, USA

Kerk F. Kee & Andrew R. Schrock

Corresponding author

Correspondence to Kerk F. Kee.

Editor information

Editors and Affiliations

Communication Studies, Wilfrid Laurier University, Waterloo, ON, Canada

Jeremy Hunsinger

Deakin University, Burwood, VIC, Australia

Matthew M. Allen

IT University of Copenhagen, Copenhagen, Denmark

Lisbeth Klastrup


Copyright information

© 2020 Springer Nature B.V.

About this entry

Cite this entry

Kee, K.F., Schrock, A.R. (2020). Telephone Interviewing as a Qualitative Methodology for Researching Cyberinfrastructure and Virtual Organizations. In: Hunsinger, J., Allen, M., Klastrup, L. (eds) Second International Handbook of Internet Research. Springer, Dordrecht. https://doi.org/10.1007/978-94-024-1555-1_52

Published: 10 October 2019

Publisher Name: Springer, Dordrecht

Print ISBN: 978-94-024-1553-7

Online ISBN: 978-94-024-1555-1



Phone Interviews


When you think about phone interviews, things like marketing and polling are the first scenarios that come to mind. This preconception invites skepticism about phone interviewing as a research method, especially among academics. However, a phone call can be a useful way to conduct structured interviews. In recent years, an increasing amount of research has utilized phone calls as a means to collect open-ended data. There have even been ethnographic studies done completely over the phone. Here we will cover some advantages and disadvantages of collecting interview data over the phone and explore in greater detail some effective strategies for conducting phone interviews.

Bias Check!

Phone interviews can sometimes get lost between the old, reliable face-to-face technique and the new and exciting online interview. Many researchers tend to view phone research as inferior to other forms of gathering interview data and have low expectations for the quality of phone data (Novick, 2008). In fact, some researchers have expressed the opinion that phone interviews correlate with highly structured, closed-ended questions and therefore do not have the ability to generate the natural responses elicited in face-to-face interviews (Novick, 2008). In a 2008 analysis of phone interviews in qualitative research, Gina Novick notes that there is not enough evidence against phone interviews to view them as less legitimate than face-to-face interviews. Therefore, as you continue reading this section on phone interviews, try to refrain from any bias you might hold against the phone as a means of gathering interview data. Instead, evaluate its usefulness to your particular research project based on the advantages and disadvantages laid out below.

Conducting an interview over the phone

Conducting an interview over the phone also entails different collection strategies than a face-to-face interview. Before going into the interviews, think about how you will record the data. You could simply put the call on speaker and record the conversation, or you could record using computer software and a microphone. Either way, be sure to also take physical notes in case the quality of the recording is lacking or the call breaks up. Test all the technology you are using ahead of time and find a place with limited background noise. When you schedule interviews with your participants, it is a good idea to mention that they should also find a space to talk with little background noise or chance of interruption. It can be easiest to communicate via email to address pre-interview concerns and give an overview of the research. However, you should tailor your participant recruitment approach to your target population (Farooq & De Villiers, 2017). For example, you may want to send written invitations or recruit by phone, depending on your target population.
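If you take the computer-software route, a minimal sketch is shown below, assuming the third-party sounddevice and scipy packages; the duration, sample rate, and file name are placeholders, and you should always obtain the participant's consent before recording.

```python
# A sketch of recording with computer software and a microphone, using
# the third-party sounddevice and scipy packages
# (pip install sounddevice scipy). The duration, sample rate, and file
# name are placeholders; obtain consent before recording any call.
import sounddevice as sd
from scipy.io import wavfile

SAMPLE_RATE = 16_000          # speech-quality mono audio
DURATION_SECONDS = 60 * 60    # reserve a generous buffer for the session

# Capture from the default input device into a NumPy array.
recording = sd.rec(
    int(DURATION_SECONDS * SAMPLE_RATE),
    samplerate=SAMPLE_RATE,
    channels=1,
)
sd.wait()  # block until the buffer is full

# Write the raw take to disk; encrypt or move it to secure storage next.
wavfile.write("phone_interview_raw.wav", SAMPLE_RATE, recording)
```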

During the interview itself, remember that the participants can’t see your face and are thus not receiving any visual cues. Because of this, your tone is especially important and you should be conscious of the verbal feedback you are giving participants. Indicate your presence and interest by acknowledging what the participant has said when they are done talking (Farooq & De Villiers, 2017). While a simple nod during face-to-face interviews to indicate interest would suffice, this must instead be done via verbal acknowledgement in a phone interview. 

For many people, phone calls can feel more stressful than talking face-to-face. Keep this in mind throughout the interview and be sure to create a comfortable environment for your participants. This might mean taking extra time to introduce yourself at the beginning of the conversation, continuing to use a light and casual tone as you speak, or offering opportunities to ask questions at the beginning or end of the call.



Chapter 11. Interviewing

Introduction

Interviewing people is at the heart of qualitative research. It is not merely a way to collect data but an intrinsically rewarding activity—an interaction between two people that holds the potential for greater understanding and interpersonal development. Unlike many of our daily interactions with others that are fairly shallow and mundane, sitting down with a person for an hour or two and really listening to what they have to say is a profound and deep enterprise, one that can provide not only “data” for you, the interviewer, but also self-understanding and a feeling of being heard for the interviewee. I always approach interviewing with a deep appreciation for the opportunity it gives me to understand how other people experience the world. That said, there is not one kind of interview but many, and some of these are shallower than others. This chapter will provide you with an overview of interview techniques but with a special focus on the in-depth semistructured interview guide approach, which is the approach most widely used in social science research.

An interview can be variously defined as “a conversation with a purpose” (Lune and Berg 2018) and an attempt to understand the world from the point of view of the person being interviewed: “to unfold the meaning of peoples’ experiences, to uncover their lived world prior to scientific explanations” (Kvale 2007). It is a form of active listening in which the interviewer steers the conversation to subjects and topics of interest to their research but also manages to leave enough space for those interviewed to say surprising things. Achieving that balance is a tricky thing, which is why most practitioners believe interviewing is both an art and a science. In my experience as a teacher, there are some students who are “natural” interviewers (often they are introverts), but anyone can learn to conduct interviews, and everyone, even those of us who have been doing this for years, can improve their interviewing skills. This might be a good time to highlight the fact that the interview is a product between interviewer and interviewee and that this product is only as good as the rapport established between the two participants. Active listening is the key to establishing this necessary rapport.

Patton (2002) makes the argument that we use interviews because there are certain things that are not observable. In particular, “we cannot observe feelings, thoughts, and intentions. We cannot observe behaviors that took place at some previous point in time. We cannot observe situations that preclude the presence of an observer. We cannot observe how people have organized the world and the meanings they attach to what goes on in the world. We have to ask people questions about those things” (341).

Types of Interviews

There are several distinct types of interviews. Imagine a continuum (figure 11.1). On one side are unstructured conversations—the kind you have with your friends. No one is in control of those conversations, and what you talk about is often random—whatever pops into your head. There is no secret, underlying purpose to your talking—if anything, the purpose is to talk to and engage with each other, and the words you use and the things you talk about are a little beside the point. An unstructured interview is a little like this informal conversation, except that one of the parties to the conversation (you, the researcher) does have an underlying purpose, and that is to understand the other person. You are not friends speaking for no purpose, but it might feel just as unstructured to the “interviewee” in this scenario. That is one side of the continuum. On the other side are fully structured and standardized survey-type questions asked face-to-face. Here it is very clear who is asking the questions and who is answering them. This doesn’t feel like a conversation at all! A lot of people new to interviewing have this (erroneously!) in mind when they think about interviews as data collection. Somewhere in the middle of these two extreme cases is the “semistructured” interview, in which the researcher uses an “interview guide” to gently move the conversation to certain topics and issues. This is the primary form of interviewing for qualitative social scientists and will be what I refer to as interviewing for the rest of this chapter, unless otherwise specified.

Figure 11.1. Types of interviews along a continuum: unstructured conversations, semistructured interview, structured interview, survey questions.

Informal (unstructured conversations). This is the most “open-ended” approach to interviewing. It is particularly useful in conjunction with observational methods (see chapters 13 and 14). There are no predetermined questions. Each interview will be different. Imagine you are researching the Oregon Country Fair, an annual event in Veneta, Oregon, that includes live music, artisan craft booths, face painting, and a lot of people walking through forest paths. It’s unlikely that you will be able to get a person to sit down with you and talk intensely about a set of questions for an hour and a half. But you might be able to sidle up to several people and engage with them about their experiences at the fair. You might have a general interest in what attracts people to these events, so you could start a conversation by asking strangers why they are here or why they come back every year. That’s it. Then you have a conversation that may lead you anywhere. Maybe one person tells a long story about how their parents brought them here when they were a kid. A second person talks about how this is better than Burning Man. A third person shares their favorite traveling band. And yet another enthuses about the public library in the woods. During your conversations, you also talk about a lot of other things—the weather, the utilikilts for sale, the fact that a favorite food booth has disappeared. It’s all good. You may not be able to record these conversations. Instead, you might jot down notes on the spot and then, when you have the time, write down as much as you can remember about the conversations in long fieldnotes. Later, you will have to sit down with these fieldnotes and try to make sense of all the information (see chapters 18 and 19).

Interview guide (semistructured interview). This is the primary type employed by social science qualitative researchers. The researcher creates an “interview guide” in advance, which she uses in every interview. In theory, every person interviewed is asked the same questions. In practice, every person interviewed is asked mostly the same topics but not always the same questions, as the whole point of a “guide” is that it guides the direction of the conversation but does not command it. The guide is typically between five and ten questions or question areas, sometimes with suggested follow-ups or prompts. For example, one question might be “What was it like growing up in Eastern Oregon?” with prompts such as “Did you live in a rural area? What kind of high school did you attend?” to help the conversation develop. These interviews generally take place in a quiet place (not a busy walkway during a festival) and are recorded. The recordings are transcribed, and those transcriptions then become the “data” that is analyzed (see chapters 18 and 19). The conventional length of one of these types of interviews is between one hour and two hours, optimally ninety minutes. Less than one hour doesn’t allow for much development of questions and thoughts, and two hours (or more) is a lot of time to ask someone to sit still and answer questions. If you have a lot of ground to cover, and the person is willing, I highly recommend two separate interview sessions, with the second session being slightly shorter than the first (e.g., ninety minutes the first day, sixty minutes the second). There are lots of good reasons for this, but the most compelling one is that this allows you to listen to the first day’s recording and catch anything interesting you might have missed in the moment and so develop follow-up questions that can probe further. This also allows the person being interviewed to have some time to think about the issues raised in the interview and go a little deeper with their answers.

Standardized questionnaire with open responses (structured interview). This is the type of interview a lot of people have in mind when they hear “interview”: a researcher comes to your door with a clipboard and proceeds to ask you a series of questions. These questions are all the same whoever answers the door; they are “standardized.” Both the wording and the exact order are important, as people’s responses may vary depending on how and when a question is asked. These are qualitative only in that the questions allow for “open-ended responses”: people can say whatever they want rather than select from a predetermined menu of responses. For example, a survey I collaborated on included this open-ended response question: “How does class affect one’s career success in sociology?” Some of the answers were simply one word long (e.g., “debt”), and others were long statements with stories and personal anecdotes. It is possible to be surprised by the responses. Although it’s a stretch to call this kind of questioning a conversation, it does allow the person answering the question some degree of freedom in how they answer.

Survey questionnaire with closed responses (not an interview!). Standardized survey questions with specific answer options (e.g., closed responses) are not really interviews at all, and they do not generate qualitative data. For example, if we included five options for the question “How does class affect one’s career success in sociology?”—(1) debt, (2) social networks, (3) alienation, (4) family doesn’t understand, (5) type of grad program—we leave no room for surprises at all. Instead, we would most likely look at patterns around these responses, thinking quantitatively rather than qualitatively (e.g., using regression analysis techniques, we might find that working-class sociologists were twice as likely to bring up alienation). It can sometimes be confusing for new students because the very same survey can include both closed-ended and open-ended questions. The key is to think about how these will be analyzed and to what level surprises are possible. If your plan is to turn all responses into a number and make predictions about correlations and relationships, you are no longer conducting qualitative research. This is true even if you are conducting this survey face-to-face with a real live human. Closed-response questions are not conversations of any kind, purposeful or not.

In summary, the semistructured interview guide approach is the predominant form of interviewing for social science qualitative researchers because it allows a high degree of freedom of responses from those interviewed (thus allowing for novel discoveries) while still maintaining some connection to a research question area or topic of interest. The rest of the chapter assumes the employment of this form.

Creating an Interview Guide

Your interview guide is the instrument used to bridge your research question(s) and what the people you are interviewing want to tell you. Unlike a standardized questionnaire, the questions actually asked do not need to be exactly what you have written down in your guide. The guide is meant to create space for those you are interviewing to talk about the phenomenon of interest, but sometimes you are not even sure what that phenomenon is until you start asking questions. A priority in creating an interview guide is to ensure it offers space. One of the worst mistakes is to create questions that are so specific that the person answering them will not stray. Relatedly, questions that sound “academic” will shut down a lot of respondents. A good interview guide invites respondents to talk about what is important to them, not feel like they are performing or being evaluated by you.

Good interview questions should not sound like your “research question” at all. For example, let’s say your research question is “How do patriarchal assumptions influence men’s understanding of climate change and responses to climate change?” It would be worse than unhelpful to ask a respondent, “How do your assumptions about the role of men affect your understanding of climate change?” You need to unpack this into manageable nuggets that pull your respondent into the area of interest without leading him anywhere. You could start by asking him what he thinks about climate change in general. Or, even better, whether he has any concerns about heatwaves or increased tornadoes or polar icecaps melting. Once he starts talking about that, you can ask follow-up questions that bring in issues around gendered roles, perhaps asking if he is married (to a woman) and whether his wife shares his thoughts and, if not, how they negotiate that difference. The fact is, you won’t really know the right questions to ask until he starts talking.
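If it helps to see the shape of a guide, here is a minimal sketch that stores question areas with optional prompts as plain Python data. The wording adapts the climate-change example above; it is illustrative, not a template.

```python
# A sketch of an interview guide kept as plain data: each main question
# area carries optional prompts. The wording adapts this chapter's
# climate-change example and is illustrative, not a template.
interview_guide = [
    {
        "question": "Do you have any concerns about heatwaves or other unusual weather?",
        "prompts": [
            "Have you ever experienced an unusual weather event? What happened?",
            "You said you work outside? What is a typical summer workday like for you?",
        ],
    },
    {
        "question": "Does your wife share your thoughts about climate change?",
        "prompts": [
            "If not, how do you negotiate that difference?",
        ],
    },
]

# Print the guide in the order it would be used in the interview.
for number, area in enumerate(interview_guide, start=1):
    print(f"{number}. {area['question']}")
    for prompt in area["prompts"]:
        print(f"   - prompt: {prompt}")
```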

There are several distinct types of questions that can be used in your interview guide, either as main questions or as follow-up probes. If you remember that the point is to leave space for the respondent, you will craft a much more effective interview guide! You will also want to think about the place of time in both the questions themselves (past, present, future orientations) and the sequencing of the questions.

Researcher Note

Suggestion : As you read the next three sections (types of questions, temporality, question sequence), have in mind a particular research question, and try to draft questions and sequence them in a way that opens space for a discussion that helps you answer your research question.

Type of Questions

Experience and behavior questions ask about what a respondent does regularly (their behavior) or has done (their experience). These are relatively easy questions for people to answer because they appear more “factual” and less subjective. This makes them good opening questions. For the study on climate change above, you might ask, “Have you ever experienced an unusual weather event? What happened?” Or “You said you work outside? What is a typical summer workday like for you? How do you protect yourself from the heat?”

Opinion and values questions , in contrast, ask questions that get inside the minds of those you are interviewing. “Do you think climate change is real? Who or what is responsible for it?” are two such questions. Note that you don’t have to literally ask, “What is your opinion of X?” but you can find a way to ask the specific question relevant to the conversation you are having. These questions are a bit trickier to ask because the answers you get may depend in part on how your respondent perceives you and whether they want to please you or not. We’ve talked a fair amount about being reflective. Here is another place where this comes into play. You need to be aware of the effect your presence might have on the answers you are receiving and adjust accordingly. If you are a woman who is perceived as liberal asking a man who identifies as conservative about climate change, there is a lot of subtext that can be going on in the interview. There is no one right way to resolve this, but you must at least be aware of it.

Feeling questions are questions that ask respondents to draw on their emotional responses. It’s pretty common for academic researchers to forget that we have bodies and emotions, but people’s understandings of the world often operate at this affective level, sometimes unconsciously or barely consciously. It is a good idea to include questions that leave space for respondents to remember, imagine, or relive emotional responses to particular phenomena. “What was it like when you heard your cousin’s house burned down in that wildfire?” doesn’t explicitly use any emotion words, but it allows your respondent to remember what was probably a pretty emotional day. And if they respond emotionally neutral, that is pretty interesting data too. Note that asking someone “How do you feel about X” is not always going to evoke an emotional response, as they might simply turn around and respond with “I think that…” It is better to craft a question that actually pushes the respondent into the affective category. This might be a specific follow-up to an experience and behavior question —for example, “You just told me about your daily routine during the summer heat. Do you worry it is going to get worse?” or “Have you ever been afraid it will be too hot to get your work accomplished?”

Knowledge questions ask respondents what they actually know about something factual. We have to be careful when we ask these types of questions so that respondents do not feel like we are evaluating them (which would shut them down), but, for example, it is helpful to know when you are having a conversation about climate change that your respondent does in fact know that unusual weather events have increased and that these have been attributed to climate change! Asking these questions can set the stage for deeper questions and can ensure that the conversation makes the same kind of sense to both participants. For example, a conversation about political polarization can be put back on track once you realize that the respondent doesn’t really have a clear understanding that there are two parties in the US. Instead of asking a series of questions about Republicans and Democrats, you might shift your questions to talk more generally about political disagreements (e.g., “people against abortion”). And sometimes what you do want to know is the level of knowledge about a particular program or event (e.g., “Are you aware you can discharge your student loans through the Public Service Loan Forgiveness program?”).

Sensory questions call on all senses of the respondent to capture deeper responses. These are particularly helpful in sparking memory. “Think back to your childhood in Eastern Oregon. Describe the smells, the sounds…” Or you could use these questions to help a person access the full experience of a setting they customarily inhabit: “When you walk through the doors to your office building, what do you see? Hear? Smell?” As with feeling questions, these questions often supplement experience and behavior questions. They are another way of allowing your respondent to report fully and deeply rather than remain on the surface.

Creative questions employ illustrative examples, suggested scenarios, or simulations to get respondents to think more deeply about an issue, topic, or experience. There are many options here. In The Trouble with Passion, Erin Cech (2021) provides a scenario in which “Joe” is trying to decide whether to stay at his decent but boring computer job or follow his passion by opening a restaurant. She asks respondents, “What should Joe do?” Their answers illuminate the attraction of “passion” in job selection. In my own work, I have used a news story about an upwardly mobile young man who no longer has time to see his mother and sisters to probe respondents’ feelings about the costs of social mobility. Jessi Streib and Betsy Leondar-Wright have used single-page cartoon “scenes” to elicit evaluations of potential racial discrimination, sexual harassment, and classism. Barbara Sutton (2010) has employed lists of words (“strong,” “mother,” “victim”) on notecards she fans out and asks her female respondents to select and discuss.

Background/Demographic Questions

You most definitely will want to know more about the person you are interviewing in terms of conventional demographic information, such as age, race, gender identity, occupation, and educational attainment. These are not questions that normally open up inquiry. [1] For this reason, my practice has been to include a separate “demographic questionnaire” sheet that I ask each respondent to fill out at the conclusion of the interview. Only include those aspects that are relevant to your study. For example, if you are not exploring religion or religious affiliation, do not include questions about a person’s religion on the demographic sheet. See the example provided at the end of this chapter.

Temporality

Any type of question can have a past, present, or future orientation. For example, if you are asking a behavior question about workplace routine, you might ask the respondent to talk about past work, present work, and ideal (future) work. Similarly, if you want to understand how people cope with natural disasters, you might ask your respondent how they felt then during the wildfire and now in retrospect and whether and to what extent they have concerns for future wildfire disasters. It’s a relatively simple suggestion—don’t forget to ask about past, present, and future—but it can have a big impact on the quality of the responses you receive.

Question Sequence

Having a list of good questions or good question areas is not enough to make a good interview guide. You will want to pay attention to the order in which you ask your questions. Even though any one respondent can derail this order (perhaps by jumping to answer a question you haven’t yet asked), a good advance plan is always helpful. When thinking about sequence, remember that your goal is to get your respondent to open up to you and to say things that might surprise you. To establish rapport, it is best to start with nonthreatening questions. Asking about the present is often the safest place to begin, followed by the past (they have to know you a little bit to get there), and lastly, the future (talking about hopes and fears requires the most rapport). To allow for surprises, it is best to move from very general questions to more particular questions only later in the interview. This ensures that respondents have the freedom to bring up the topics that are relevant to them rather than feel like they are constrained to answer you narrowly. For example, refrain from asking about particular emotions until these have come up previously—don’t lead with them. Often, your more particular questions will emerge only during the course of the interview, tailored to what is emerging in conversation.

Once you have a set of questions, read through them aloud and imagine you are being asked the same questions. Does the set of questions have a natural flow? Would you be willing to answer the very first question to a total stranger? Does your sequence establish facts and experiences before moving on to opinions and values? Did you include prefatory statements, where necessary; transitions; and other announcements? These can be as simple as “Hey, we talked a lot about your experiences as a barista while in college.… Now I am turning to something completely different: how you managed friendships in college.” That is an abrupt transition, but it has been softened by your acknowledgment of that.

Probes and Flexibility

Once you have the interview guide, you will also want to leave room for probes and follow-up questions. As in the sample probe included here, you can write out the obvious probes and follow-up questions in advance. You might not need them, as your respondent might anticipate them and include full responses to the original question. Or you might need to tailor them to how your respondent answered the question. Some common probes and follow-up questions include asking for more details (When did that happen? Who else was there?), asking for elaboration (Could you say more about that?), asking for clarification (Does that mean what I think it means or something else? I understand what you mean, but someone else reading the transcript might not), and asking for contrast or comparison (How did this experience compare with last year’s event?). “Probing is a skill that comes from knowing what to look for in the interview, listening carefully to what is being said and what is not said, and being sensitive to the feedback needs of the person being interviewed” (Patton 2002:374). It takes work! And energy. I and many other interviewers I know report feeling emotionally and even physically drained after conducting an interview. You are tasked with active listening and rearranging your interview guide as needed on the fly. If you only ask the questions written down in your interview guide with no deviations, you are doing it wrong. [2]

The Final Question

Every interview guide should include a very open-ended final question that allows the respondent to say whatever it is they have been dying to tell you but you’ve forgotten to ask. About half the time they are tired too and will tell you they have nothing else to say. But incredibly, some of the most honest and complete responses take place here, at the end of a long interview. You have to realize that the person being interviewed is often discovering things about themselves as they talk to you and that this process of discovery can lead to new insights for them. Making space at the end is therefore crucial. Be sure you convey that you actually do want them to tell you more, so that the offer of “anything else?” is not read as an empty convention where the polite response is no. Here is where you can pull from that active listening and tailor the final question to the particular person. For example, “I’ve asked you a lot of questions about what it was like to live through that wildfire. I’m wondering if there is anything I’ve forgotten to ask, especially because I haven’t had that experience myself” is a much more inviting final question than “Great. Anything you want to add?” It’s also helpful to convey that you have the time to listen to their full answer, even if you are nearing the end of the allotted time. After all, there are no more questions to come, so whatever time remains belongs to the respondent. Do them the courtesy of listening!

Conducting the Interview

Once you have your interview guide, you are on your way to conducting your first interview. I always practice my interview guide with a friend or family member. I do this even when the questions don’t make perfect sense for them, as it still helps me realize which questions make no sense, are poorly worded (too academic), or don’t follow sequentially. I also practice the routine I will use for interviewing, which goes something like this:

  • Introduce myself and reintroduce the study
  • Provide consent form and ask them to sign and retain/return copy
  • Ask if they have any questions about the study before we begin
  • Ask if I can begin recording
  • Ask questions (from interview guide)
  • Turn off the recording device
  • Ask if they are willing to fill out my demographic questionnaire
  • Collect questionnaire and, without looking at the answers, place in same folder as signed consent form
  • Thank them and depart

A note on remote interviewing: Interviews have traditionally been conducted face-to-face in a private or quiet public setting. You don’t want a lot of background noise, as this will make transcriptions difficult. During the recent global pandemic, many interviewers, myself included, learned the benefits of interviewing remotely. Although face-to-face is still preferable for many reasons, Zoom interviewing is not a bad alternative, and it does allow more interviews across great distances. Zoom also includes automatic transcription, which significantly cuts down on the time it normally takes to convert our conversations into “data” to be analyzed. These automatic transcriptions are not perfect, however, and you will still need to listen to the recording and clarify and clean up the transcription. Nor do automatic transcriptions include notations of body language or change of tone, which you may want to include. When interviewing remotely, you will want to collect the consent form before you meet: ask them to read, sign, and return it as an email attachment. I think it is better to ask for the demographic questionnaire after the interview, but because some respondents may never return it then, it is probably best to ask for this at the same time as the consent form, in advance of the interview.

What should you bring to the interview? I would recommend bringing two copies of the consent form (one for you and one for the respondent), a demographic questionnaire, a manila folder in which to place the signed consent form and filled-out demographic questionnaire, a printed copy of your interview guide (I print with three-inch right margins so I can jot down notes on the page next to relevant questions), a pen, a recording device, and water.

After the interview, you will want to secure the signed consent form in a locked filing cabinet (if in print) or a password-protected folder on your computer. Using Excel or a similar program that allows tables/spreadsheets, create an identifying number for your interview that links to the consent form without using the name of your respondent. For example, let’s say that I conduct interviews with US politicians, and the first person I meet with is George W. Bush. I will assign the transcription the number “INT#001” and add it to the signed consent form. [3] The signed consent form goes into a locked filing cabinet, and I never use the name “George W. Bush” again. I take the information from the demographic sheet, open my Excel spreadsheet, and add the relevant information in separate columns for the row INT#001: White, male, Republican. When I interview Bill Clinton as my second interview, I include a second row: INT#002: White, male, Democrat. And so on. The only link to the actual name of the respondent and this information is the fact that the consent form (unavailable to anyone but me) has stamped on it the interview number.
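If you prefer to script this record keeping rather than manage the spreadsheet by hand, a small program can assign the next interview number and append the anonymized demographic row in one step. The sketch below is a minimal way to do this in Python, assuming a CSV file stands in for Excel; the file name and column labels are illustrative, not part of any standard workflow.

```python
import csv
from pathlib import Path

LOG = Path("interview_log.csv")  # stands in for the Excel spreadsheet
FIELDS = ["interview_id", "race", "gender", "party"]  # illustrative columns


def next_interview_id() -> str:
    """Return the next sequential ID: INT#001, INT#002, ..."""
    if not LOG.exists():
        return "INT#001"
    with LOG.open(newline="") as f:
        done = sum(1 for _ in csv.DictReader(f))  # rows already logged
    return f"INT#{done + 1:03d}"  # three digits allows up to 999 interviews


def log_demographics(race: str, gender: str, party: str) -> str:
    """Append one anonymized row; the respondent's name never enters this file."""
    is_new = not LOG.exists()
    interview_id = next_interview_id()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({"interview_id": interview_id, "race": race,
                         "gender": gender, "party": party})
    return interview_id  # stamp this number on the signed consent form


# The two examples from the text:
print(log_demographics("White", "male", "Republican"))  # INT#001
print(log_demographics("White", "male", "Democrat"))    # INT#002
```

As in the manual version, the only link between a name and an interview number remains the signed consent form stored separately under lock and key.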

Many students get very nervous before their first interview. Actually, many of us are always nervous before the interview! But do not worry—this is normal, and it does pass. Chances are, you will be pleasantly surprised at how comfortable it begins to feel. These “purposeful conversations” are often a delight for both participants. This is not to say that things never go wrong. I often have my students practice several “bad scenarios” (e.g., a respondent that you cannot get to open up; a respondent who is too talkative and dominates the conversation, steering it away from the topics you are interested in; emotions that completely take over; or shocking disclosures you are ill-prepared to handle), but most of the time, things go quite well. Be prepared for the unexpected, but know that the reason interviews are so popular as a technique of data collection is that they are usually richly rewarding for both participants.

One thing that I stress to my methods students and remind myself about is that interviews are still conversations between people. If there’s something you might feel uncomfortable asking someone about in a “normal” conversation, you will likely also feel a bit of discomfort asking it in an interview. Maybe more importantly, your respondent may feel uncomfortable. Social research—especially about inequality—can be uncomfortable. And it’s easy to slip into an abstract, intellectualized, or removed perspective as an interviewer. This is one reason trying out interview questions is important. Another is that sometimes the question sounds good in your head but doesn’t work as well out loud in practice. I learned this the hard way when a respondent asked me how I would answer the question I had just posed, and I realized that not only did I not really know how I would answer it, but I also wasn’t quite as sure I knew what I was asking as I had thought.

—Elizabeth M. Lee, Associate Professor of Sociology at Saint Joseph’s University, author of Class and Campus Life , and co-author of Geographies of Campus Inequality

How Many Interviews?

Your research design has included a targeted number of interviews and a recruitment plan (see chapter 5). Follow your plan, but remember that “ saturation ” is your goal. You interview as many people as you can until you reach a point at which you are no longer surprised by what they tell you. This does not mean that no one after your first twenty interviews will have surprising, interesting stories to tell you but rather that the picture you are forming about the phenomenon of interest to you from a research perspective has come into focus, and none of the interviews are substantially refocusing that picture. That is when you should stop collecting interviews. Note that to know when you have reached this point, you will need to read your transcripts as you go. More about this in chapters 18 and 19.

Your Final Product: The Ideal Interview Transcript

A good interview transcript will demonstrate a subtly controlled conversation by the skillful interviewer. In general, you want to see replies that are about one paragraph long, not short sentences and not running on for several pages. Although it is sometimes necessary to follow respondents down tangents, it is also often necessary to pull them back to the questions that form the basis of your research study. This is not really a free conversation, although it may feel like that to the person you are interviewing.

Final Tips from an Interview Master

Annette Lareau is arguably one of the masters of the trade. In Listening to People , she provides several guidelines for good interviews and then offers a detailed example of an interview gone wrong and how it could be addressed (please see the “Further Readings” at the end of this chapter). Here is an abbreviated version of her set of guidelines (Lareau 2021:93–103):

  1. Interview respondents who are experts on the subjects of most interest to you (as a corollary, don’t ask people about things they don’t know).
  2. Listen carefully and talk as little as possible.
  3. Keep in mind what you want to know and why you want to know it.
  4. Be a proactive interviewer (subtly guide the conversation).
  5. Assure respondents that there aren’t any right or wrong answers.
  6. Use the respondent’s own words to probe further (this both allows you to accurately identify what you heard and pushes the respondent to explain further).
  7. Reuse effective probes (don’t reinvent the wheel as you go—if repeating the words back works, do it again and again).
  8. Focus on learning the subjective meanings that events or experiences have for a respondent.
  9. Don’t be afraid to ask a question that draws on your own knowledge (unlike trial lawyers, who are trained never to ask a question for which they don’t already know the answer, sometimes it’s worth it to ask risky questions based on your hypotheses or just plain hunches).
  10. Keep thinking while you are listening (so difficult…and important).
  11. Return to a theme raised by a respondent if you want further information.
  12. Be mindful of power inequalities (and never ever coerce a respondent to continue the interview if they want out).
  13. Take control with overly talkative respondents.
  14. Expect overly succinct responses, and develop strategies for probing further.
  15. Balance digging deep and moving on.
  16. Develop a plan to deflect questions (e.g., let them know you are happy to answer any questions at the end of the interview, but you don’t want to take time away from them now).
  17. At the end, check to see whether you have asked all your questions. You don’t always have to ask everyone the same set of questions, but if there is a big area you have forgotten to cover, now is the time to recover.

Sample: Demographic Questionnaire

ASA Taskforce on First-Generation and Working-Class Persons in Sociology – Class Effects on Career Success

Supplementary Demographic Questionnaire

Thank you for your participation in this interview project. We would like to collect a few pieces of key demographic information from you to supplement our analyses. Your answers to these questions will be kept confidential and stored by ID number. All of your responses here are entirely voluntary!

What best captures your race/ethnicity? (please check any/all that apply)

  • White (Non Hispanic/Latina/o/x)
  • Black or African American
  • Hispanic, Latino/a/x, or Spanish origin
  • Asian or Asian American
  • American Indian or Alaska Native
  • Middle Eastern or North African
  • Native Hawaiian or Pacific Islander
  • Other: (Please write in: ________________)

What is your current position?

  • Grad Student
  • Full Professor

Please check any and all of the following that apply to you:

  • I identify as a working-class academic
  • I was the first in my family to graduate from college
  • I grew up poor

What best reflects your gender?

  • Transgender female/Transgender woman
  • Transgender male/Transgender man
  • Gender queer/ Gender nonconforming

Anything else you would like us to know about you?

Example: Interview Guide

In this example, follow-up prompts are italicized. Note the sequence of questions. That second question often elicits an entire life history, answering several later questions in advance.

Introduction Script/Question

Thank you for participating in our survey of ASA members who identify as first-generation or working-class.  As you may have heard, ASA has sponsored a taskforce on first-generation and working-class persons in sociology and we are interested in hearing from those who so identify.  Your participation in this interview will help advance our knowledge in this area.

  • The first thing we would like to ask you is why you have volunteered to be part of this study. What does it mean to you to be first-gen or working class?  Why were you willing to be interviewed?
  • How did you decide to become a sociologist?
  • Can you tell me a little bit about where you grew up? (prompts: What did your parent(s) do for a living? What kind of high school did you attend?)
  • Has this identity been salient to your experience? (How? How much?)
  • How welcoming was your grad program? Your first academic employer?
  • Why did you decide to pursue sociology at the graduate level?
  • Did you experience culture shock in college? In graduate school?
  • Has your FGWC status shaped how you’ve thought about where you went to school? debt? etc?
  • Were you mentored? How did this work (not work)?  How might it?
  • What did you consider when deciding where to go to grad school? Where to apply for your first position?
  • What, to you, is a mark of career success? Have you achieved that success?  What has helped or hindered your pursuit of success?
  • Do you think sociology, as a field, cares about prestige?
  • Let’s talk a little bit about intersectionality. How does being first-gen/working class work alongside other identities that are important to you?
  • What do your friends and family think about your career? Have you had any difficulty relating to family members or past friends since becoming highly educated?
  • Do you have any debt from college/grad school? Are you concerned about this?  Could you explain more about how you paid for college/grad school?  (here, include assistance from family, fellowships, scholarships, etc.)
  • (You’ve mentioned issues or obstacles you had because of your background.) What could have helped?  Or, who or what did? Can you think of fortuitous moments in your career?
  • Do you have any regrets about the path you took?
  • Is there anything else you would like to add? Anything that the Taskforce should take note of, that we did not ask you about here?

Further Readings

Britten, Nicky. 1995. “Qualitative Interviews in Medical Research.” BMJ: British Medical Journal 311(6999):251–253. A good basic overview of interviewing, particularly useful for students of public health and medical research generally.

Corbin, Juliet, and Janice M. Morse. 2003. “The Unstructured Interactive Interview: Issues of Reciprocity and Risks When Dealing with Sensitive Topics.” Qualitative Inquiry 9(3):335–354. Weighs the potential benefits and harms of conducting interviews on topics that may cause emotional distress. Argues that the researcher’s skills and code of ethics should ensure that the interviewing process provides more of a benefit to both participant and researcher than a harm to the former.

Gerson, Kathleen, and Sarah Damaske. 2020. The Science and Art of Interviewing . New York: Oxford University Press. A useful guidebook/textbook for both undergraduates and graduate students, written by sociologists.

Kvale, Steinar. 2007. Doing Interviews . London: SAGE. An easy-to-follow guide to conducting and analyzing interviews, written by a psychologist.

Lamont, Michèle, and Ann Swidler. 2014. “Methodological Pluralism and the Possibilities and Limits of Interviewing.” Qualitative Sociology 37(2):153–171. Written as a response to various debates surrounding the relative value of interview-based studies and ethnographic studies defending the particular strengths of interviewing. This is a must-read article for anyone seriously engaging in qualitative research!

Pugh, Allison J. 2013. “What Good Are Interviews for Thinking about Culture? Demystifying Interpretive Analysis.” American Journal of Cultural Sociology 1(1):42–68. Another defense of interviewing written against those who champion ethnographic methods as superior, particularly in the area of studying culture. A classic.

Rapley, Timothy John. 2001. “The ‘Artfulness’ of Open-Ended Interviewing: Some Considerations in Analyzing Interviews.” Qualitative Research 1(3):303–323. Argues for the importance of “local context” of data production (the relationship built between interviewer and interviewee, for example) in properly analyzing interview data.

Weiss, Robert S. 1995. Learning from Strangers: The Art and Method of Qualitative Interview Studies . New York: Simon and Schuster. A classic and well-regarded textbook on interviewing. Because Weiss has extensive experience conducting surveys, he contrasts the qualitative interview with the survey questionnaire well; particularly useful for those trained in the latter.

  • I say “normally” because how people understand their various identities can itself be an expansive topic of inquiry. Here, I am merely talking about collecting otherwise unexamined demographic data, similar to how we ask people to check boxes on surveys. ↵
  • Again, this applies to “semistructured in-depth interviewing.” When conducting standardized questionnaires, you will want to ask each question exactly as written, without deviations! ↵
  • I always include “INT” in the number because I sometimes have other kinds of data with their own numbering: FG#001 would mean the first focus group, for example. I also always include three-digit spaces, as this allows for up to 999 interviews (or, more realistically, allows for me to interview up to one hundred persons without having to reset my numbering system). ↵

A method of data collection in which the researcher asks the participant questions; the answers to these questions are often recorded and transcribed verbatim. There are many different kinds of interviews; see also semi-structured interview, structured interview, and unstructured interview.

A document listing key questions and question areas for use during an interview. It is used most often for semi-structured interviews. A good interview guide may have no more than ten primary questions for two hours of interviewing, but these ten questions will be supplemented by probes and relevant follow-ups throughout the interview. Most IRBs require the inclusion of the interview guide in applications for review. See also interview and semi-structured interview.

A data-collection method that relies on casual, conversational, and informal interviewing. Despite its apparent conversational nature, the researcher usually has a set of particular questions or question areas in mind but allows the interview to unfold spontaneously. This is a common data-collection technique among ethnographers. Compare to the semi-structured or in-depth interview.

A form of interview that follows a standard guide of questions asked, although the order of the questions may change to match the particular needs of each individual interview subject, and probing “follow-up” questions are often added during the course of the interview. The semi-structured interview is the primary form of interviewing used by qualitative researchers in the social sciences. It is sometimes referred to as an “in-depth” interview. See also interview and interview guide.

The cluster of data-collection tools and techniques that involve observing interactions between people, the behaviors and practices of individuals (sometimes in contrast to what they say about how they act and behave), and cultures in context. Observational methods are the key tools employed by ethnographers and practitioners of Grounded Theory.

Follow-up questions used in a semi-structured interview to elicit further elaboration. Suggested prompts can be included in the interview guide, to be used/deployed depending on how the initial question was answered or if the topic of the prompt does not emerge spontaneously.

A form of interview that follows a strict set of questions, asked in a particular order, for all interview subjects. The questions are also the kind that elicit short answers, and the data is more “informative” than probing. This is often used in mixed-methods studies, accompanying a survey instrument. Because there is no room for nuance or the exploration of meaning in structured interviews, qualitative researchers tend to employ semi-structured interviews instead. See also interview.

The point at which you can conclude data collection because every person you are interviewing, the interaction you are observing, or content you are analyzing merely confirms what you have already noted.  Achieving saturation is often used as the justification for the final sample size.

An interview variant in which a person’s life story is elicited in a narrative form.  Turning points and key themes are established by the researcher and used as data points for further analysis.

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.

From Telephone to the Web: The Challenge of Mode of Interview Effects in Public Opinion Polls

Among the most striking trends in the field of survey research in the past two decades is the shift from interviewer-administered to self-administered surveys. Fueled by the growth of the internet, self-administration as a survey mode presents a mixture of opportunities and challenges to the field. Self-administered surveys tend to be less expensive and to provide ways of asking questions that are difficult or impossible to ask in an interviewer-administered survey.

But the results from self-administered and interviewer-administered surveys are sometimes different. This difference is called a mode effect, a difference in responses to a survey question attributable to the mode in which the question is administered. Among the issues this raises are how to track trends in responses over time when the mode of interview has changed and how to handle the inconsistencies when combining data gathered using different modes.

Using its nationally representative American Trends Panel, Pew Research Center conducted a large-scale experiment that tested the effects of the mode of survey interview – in this case, a telephone survey with an interviewer vs. a self-administered survey on the Web – on results from a set of 60 questions like those commonly asked by the center’s research programs. This report describes the effort to catalog and evaluate mode effects in public opinion surveys.

The study finds that differences in responses by survey mode are fairly common, but typically not large, with a mean difference of 5.5 percentage points and a median difference of five points across the 60 questions. The differences range in size from 0 to 18 percentage points. The results are based on 3,003 respondents who were randomly assigned to either the phone or Web mode and interviewed July 7-Aug. 4, 2014 for this study.
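As a rough illustration of how such summary figures are computed, the sketch below takes per-question percentages from each mode and reports the mean, median, and maximum absolute gaps. The two questions shown reuse figures quoted elsewhere in this report; the study's actual calculation covers all 60 questions, whose per-question differences appear in its appendix tables.

```python
from statistics import mean, median

# Percentage choosing the benchmark category of each question, by mode.
# Two example entries only; the real study compared 60 questions.
web   = {"very unfavorable: Clinton": 27, "talks to neighbors weekly": 47}
phone = {"very unfavorable: Clinton": 19, "talks to neighbors weekly": 58}

gaps = [abs(web[q] - phone[q]) for q in web]  # absolute gaps in pct. points
print(f"mean {mean(gaps):.1f}, median {median(gaps):.1f}, max {max(gaps)}")
```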

Where differences occurred, they were especially large on three broad types of questions: Items that asked the respondent to assess the quality of their family and social life produced differences of 18 and 14 percentage points, respectively, with those interviewed on the phone reporting higher levels of satisfaction than those who completed the survey on the Web.

Questions about societal discrimination against several different groups also produced large differences, with telephone respondents more apt than Web respondents to say that gays and lesbians, Hispanics and blacks face a lot of discrimination. However, there was no significant mode difference in responses to the question of whether women face a lot of discrimination.

Different Answers on Web & Phone

Web respondents were far more likely than those interviewed on the phone to give various political figures a “very unfavorable” rating, a tendency that was concentrated among members of the opposite party of each figure rated.

Statistically significant mode effects also were observed on several other questions. Telephone respondents were more likely than those interviewed on the Web to say they often talked with their neighbors, to rate their communities as an “excellent” place to live and to rate their own health as “excellent.” Web respondents were more likely than phone respondents to report being unable to afford food or needed medical care at some point in the past twelve months.

One important concern about mode effects is that they do not always affect all respondents in the same way. Certain kinds of respondents may be more vulnerable than others to the effect of the mode of interview. In some instances, this may be a consequence of cognitive factors; for example, well-educated respondents may be better able than those with less education to comprehend written questions. In other instances, the sensitivity of a question may be greater for certain respondents than for others; for example, mode effects on questions about financial difficulties may be much larger among low income individuals — the people most likely to experience such troubles. 1

Despite these sometimes substantial differences, the study found that many commonly used survey questions evidence no mode effect. Reports about various personal activities performed “yesterday” – such as getting news from a newspaper, on television or on the radio; calling a friend or relative; writing or receiving a letter; or getting some type of exercise – showed no significant differences by mode of interview. And most questions about religious affiliation, belief and practice yielded similar results on the Web and on the phone, though Web respondents were somewhat more likely than those interviewed on the telephone to say that they “seldom” or “never” attended religious services.

About the Study


This study was conducted using Pew Research Center’s nationally representative American Trends Panel (ATP). Panelists who normally take their surveys on the Web were randomly assigned to either the phone mode (N=1,494 completed by phone) or the Web mode (N=1,509 completed on the Web). Each set of respondents was independently weighted to be representative of the U.S. public in an effort to ensure that any differences observed between the groups were a result only of mode of interview effects. Mode differences for each question were measured by comparing the answers given by the Web and phone groups on a single response category — typically the most commonly reported category, or the category showing the largest mode difference.
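The sketch below is a deliberately simplified picture of what "independently weighted" means in practice: each mode group is weighted so that its composition matches population benchmarks. The single education dimension and its shares are invented for illustration; real panel weighting adjusts on many demographic dimensions at once (in practice, iterative procedures such as raking).

```python
# Hypothetical benchmark: shares of the U.S. adult population by education.
population = {"college": 0.30, "no_college": 0.70}


def cell_weights(sample_counts: dict) -> dict:
    """Weight = population share / sample share, computed per cell."""
    n = sum(sample_counts.values())
    return {cell: population[cell] / (count / n)
            for cell, count in sample_counts.items()}


# Suppose one mode group over-represents college graduates:
print(cell_weights({"college": 600, "no_college": 900}))
# {'college': 0.75, 'no_college': 1.1666...}: the over-represented cell
# is weighted down and the under-represented cell is weighted up.
```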

Why Mode of Interview Effects Occur

The experience of being interviewed by another person differs from completing a survey online or on paper. For example, an interviewer can help respondents stay focused and may be able to provide clarification or encouragement at difficult junctures during the interview.

But the social interaction inherent in a telephone or in-person interview may also exert subtle pressures on respondents that affect how they answer questions. Respondents may feel a need to present themselves in a more positive light to an interviewer, leading to an overstatement of socially desirable behaviors and attitudes and an understatement of opinions and behaviors they fear would elicit disapproval from another person. Previous research has shown that respondents understate such activities as drug and alcohol use and overstate activities like donating to charity or helping other people. This phenomenon is often referred to as “social desirability bias.” These effects may be stronger among certain types of people than others, introducing additional bias into the results. 2

Most of the largest mode differences in this study appear on questions where social desirability bias could play a role in the responses. Of the 21 items showing a difference by mode of at least seven percentage points, seven involve ratings of political figures (and very negative ratings are less prevalent for all seven items on the phone than on the Web), four involve questions about intimate personal issues including life satisfaction, health status and financial troubles (with positive responses more common on the phone across all of them) and three relate to perceptions of discrimination against minority groups (with phone respondents more likely to say there is discrimination against each group). Two other questions that fit within this framework are talking with neighbors and attending religious services. Phone respondents were 11 points more likely than Web respondents to say they talked with neighbors at least a few times a week. Web respondents were seven points more likely than phone respondents to say that they seldom or never attend religious services.

But not all questions that touch on potentially sensitive topics or involve behaviors that are socially desirable or undesirable exhibited mode effects. For example, there was no significant mode difference in how people rated their own personal happiness; or in the percentages of people who said they had done volunteer work in the past year, called a friend or relative yesterday just to talk, or visited with family or friends yesterday. There also were no differences by mode in the shares of people who are religiously unaffiliated, think that a person must believe in God in order to be moral or say that religion is very important in their life.

In addition, there are other sources of mode difference apart from social desirability. Because surveys require cognitive processing of words and phrases to understand a question and choose an appropriate response option, the channel in which the question and options are communicated can also affect responses. A complicated question with many different response options may be very difficult to comprehend when someone hears it on the phone, but easier to process when read online or on paper. Because it is easier to remember, the last response option read by an interviewer may be favored by respondents — a phenomenon called the “recency effect.” This effect is less prevalent in a self-administered survey, where respondents can see all of the response options at a glance or can go back and re-read a question on their own. 3

One question in the survey was lengthy and somewhat complicated and could have posed a greater challenge to phone than Web respondents: an item that asked respondents to place themselves in one or more of a set of racial or ethnic categories. This item was modeled on a new question under review by the U.S. Census Bureau that, for the first time, includes Hispanic origin as an option along with the more traditional race categories such as white, black or African American, Asian or Asian American. Yet respondents on the phone and the Web gave nearly identical answers.

The implicit time pressure in an interviewer-administered survey can affect a respondent’s willingness to engage in the amount of thought necessary to recall facts or past events, leading to different answers than would be obtained if no interviewer were involved. And, of course, the absence of an interviewer might make it more likely that some respondents on the Web or on paper decide to speed through a questionnaire in order to finish more quickly, thus providing lower-quality data.

In general, we found little evidence that cognitive processes of these sorts created mode differences in responses. That may reflect the fact that the questions chosen for this study are drawn from well-tested items designed for telephone surveys, and thus do not reflect the kinds of burdensome items that have previously been shown to create mode effects. It’s also possible that the panelists, having participated in one large telephone survey with Pew Research Center (the polarization study that was used to recruit the American Trends Panel) and – for the vast majority – at least one previous panel wave, are savvier survey participants than others and thus are less vulnerable to mode effects than a fresh cross-sectional sample would be.

This report presents the study’s findings in a series of sections organized by the topic of the survey questions used. Following a discussion of the study’s conclusions, there is a detailed methodological description of the study. A table presenting all of the items sorted by the size of the mode differences follows. At the end is a complete topline showing all questions and response categories.

Scope of the Mode Differences

This study is composed of mode of interview comparisons across 60 different questions covering a range of subjects and question formats. Topics include politics, religion, social relationships, daily activities, personal health, interpersonal trust and others. Question formats ranged from simple categorical items (rating one’s community as excellent, good, only fair, poor), to yes/no items (read a newspaper yesterday), to completely open-ended items (how many doctor visits in the past 12 months), to 0-100 rating scales (rate the Democratic and Republican leaders in Congress).

Summary of Mode Differences

Responses to all but four of the 60 items showed at least some numerical difference by mode, and the median difference across all items was 5 percentage points. The largest difference was 18 points, and there were eight items with differences of at least 10 points. But most of the 24 non-zero differences smaller than 5 percentage points are not statistically significant, and thus could have occurred by chance as a result of sampling error.
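To see why small gaps "could have occurred by chance," a back-of-the-envelope two-proportion z-test helps. The sketch below ignores the survey weights, which inflate variance, so it is looser than the study's own significance tests; the group sizes are this study's, while the 25% and 22% figures are hypothetical.

```python
from math import erf, sqrt


def two_prop_p(p1: float, n1: int, p2: float, n2: int) -> float:
    """Two-sided p-value for a difference between independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # upper normal tail, doubled


# A hypothetical 3-point gap with this study's group sizes:
print(two_prop_p(0.25, 1509, 0.22, 1494))  # roughly 0.05, borderline at best
```

With roughly 1,500 respondents per group, unweighted gaps of about three points or less sit near or above the conventional 0.05 threshold, which is consistent with the report's statement that most sub-5-point differences are not statistically significant.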

The following sections of the report provide details about the presence or absence of mode differences across all of the measures included in the study. Each section includes an analysis of the overall results and an examination of the kinds of people most likely to be affected by the mode of interview. In general, only those differences that are statistically significant are highlighted, except in a few instances where there was a strong expectation of a mode effect and none was found.

Sizeable Mode Effects in Political Measures

Some of the largest mode differences in the study are seen in the ratings of political figures. Public views — on both sides of the aisle — are considerably more negative when expressed via the Web than over the phone. The mode differences are even larger when looking at the ratings of political figures by respondents of the opposite political party.

Views of Political Figures More Negative on the Web Than Phone

Hillary Clinton’s ratings are a good example of this pattern. When asked on the phone, 19% of respondents told interviewers they have a “very unfavorable” opinion of Clinton; that number jumps to 27% on the Web. However, like most of the political figures asked about, Clinton’s positive ratings vary only modestly by mode — 53% rate her positively on the Web, compared with 57% on the phone.

The mode effect on very unfavorable ratings is quite large among Republicans and those who lean Republican. Fully 53% of Republicans or Republican leaners hold a “very unfavorable” view of Hillary Clinton on the Web, compared with only 36% on the phone.  There is no difference in Democrats’ unfavorable ratings of Clinton by mode.

A similar mode difference is also evident with ratings of Michelle Obama. While just 16% of phone respondents have a very unfavorable opinion of the First Lady, one quarter (25%) of those on the Web do so. As with Hillary Clinton, mode differences in views about Michelle Obama are largely driven by Republicans and Republican leaners.  While nearly half (46%) of Republicans on the Web had a very unfavorable opinion of the First Lady, less than one-third (31%) gave the same response on the phone — a 15-point difference.

The same patterns seen with Hillary Clinton and Michelle Obama are evident for Republican political figures. Web respondents are 13 points more likely than phone respondents to rate Sarah Palin as “very unfavorable.” Among Democrats and Democratic leaners, 63% express a very unfavorable view of Sarah Palin on the Web, compared with only 44% on the phone.

There is a 9-percentage point mode difference in “very unfavorable” ratings of George W. Bush (22% on the phone, 31% on the Web). Among Democrats, 49% on the Web rate the former president very unfavorably, compared with 36% on the phone.

Mode Differences Larger Among Members of the Opposite Party

Half (51%) of Republicans on the Web report “very unfavorable” feelings towards Senate Minority Leader Harry Reid, a result 16 percentage points higher than on the phone (35%). Senate Majority Leader Mitch McConnell sees only a modest mode difference in negativity: 14% of phone respondents are “very unfavorable” in their rating, a number that climbs slightly to 19% among Web respondents.

Telephone respondents also tended to give higher net favorability ratings than Web respondents for Michelle Obama (68% vs. 59%), George W. Bush (50% vs. 43%), Mitch McConnell (25% vs. 17%) and Harry Reid (29% vs. 18%). Unlike the pattern with “very unfavorable” responses, these more positive ratings occur across party lines, with both Democrats and Republicans rating leaders from both parties more favorably on the phone than on the Web.

A larger share of Web respondents indicated that they had never heard of or could not rate Harry Reid and Mitch McConnell than on the phone. On the Web, 44% would not provide a rating of McConnell, compared with 38% on the phone; 40% of Web respondents had no rating for Reid, compared with 31% of phone respondents. This is likely due to the fact that Web respondents were offered an option labeled “Never heard of/Not sure.” No explicit option was offered to respondents on the telephone, though interviewers were instructed to accept no opinion responses and not push for a rating. The presence or absence of an explicit no opinion option does not matter for the well-known figures like Hillary Clinton or George W. Bush, but makes a significant difference for the lesser-known Reid and McConnell.


The effects of the interview mode on ratings of political figures also appear in a different question format — the “feeling thermometer” scale that varies from 0 to 100. As with the verbal scales, more people express highly negative views on the Web than on the phone. Asked to rate party leaders in Congress on a 0 to 100 scale, 44% of Web respondents give Republican leaders in Congress between 0 and 33, the “cold” or negative end of the scale. When asked on the phone, 37% gave responses in this range. That 7-percentage point difference is the same as with Democratic leaders in Congress (32% on the phone, 39% on the Web). As with ratings of specific political figures, mode differences on these scales are much larger among members of the opposite party of the group being rated.

The use of a numerical scale highlights another difference between Web and phone. Phone respondents are more likely than Web respondents to select an answer closer to the midpoint of the scale (between 34 and 66). When rating Democratic leaders in Congress, 36% of Web respondents selected a number in the middle range, compared with 45% of phone respondents.
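For readers replicating this kind of analysis, collapsing the 0-100 thermometer into the bands used here is a simple recode. The cut points below (0-33 cold, 34-66 middle, 67-100 warm) follow the report's description, while the example ratings are invented.

```python
from collections import Counter


def thermometer_band(rating: int) -> str:
    """Collapse a 0-100 feeling-thermometer rating into the report's bands."""
    if not 0 <= rating <= 100:
        raise ValueError("thermometer ratings run from 0 to 100")
    if rating <= 33:
        return "cold"    # 0-33: the negative end of the scale
    if rating <= 66:
        return "middle"  # 34-66: around the midpoint
    return "warm"        # 67-100: the positive end


ratings = [10, 50, 85, 33, 67]  # invented example responses
print(Counter(thermometer_band(r) for r in ratings))
# Counter({'cold': 2, 'middle': 1, 'warm': 2})
```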

Mode effects also appear on opinion questions about political issues. A higher percentage of respondents on the Web than on the phone say the United States should be less involved in resolving the dispute between Israel and the Palestinians (45% say this on the Web, 37% on the phone). The mode difference is even larger among Republicans (14 percentage points more likely to say on the Web than on the phone that the U.S. should be less involved in the conflict) and among white evangelical Protestants (13 percentage points more likely to say “less involved” on the Web). In fact, “less involved” is the most common response for Republicans on the Web (44%), while “more involved” is the modal response for those interviewed by phone (42%).

In thinking about the United States’ global profile more broadly, Web respondents express slightly more reluctance about U.S. involvement in the global economy than do phone respondents. On the phone, 34% of respondents say greater U.S. involvement in the global economy is a bad thing, compared with 39% on the Web.

There is little apparent difference by mode in opinions about the government’s anti-terrorism policies. About equal numbers on the phone (50%) and on the Web (51%) say that the policies have gone too far in restricting civil liberties.

Measures of Discrimination Significantly Affected by Mode

Considerable mode differences were observed on questions about societal discrimination against several groups including gays and lesbians, Hispanics and blacks, with phone respondents more likely than Web respondents to say these groups faced “a lot” of discrimination. But the impact of the mode of interview varied by the race and ethnicity of the respondents.

Sizeable Mode Effects on Measures of Discrimination

When asked about discrimination against gays and lesbians, 62% of respondents on the phone say they face “a lot” of discrimination; on the Web, only 48% give the same answer. This mode effect appears among both Democrats and Republicans. Among Democrats, 77% on the phone say the LGBT community faces “a lot” of discrimination, compared with only 62% on the Web — a 15 point difference. Among Republicans, the difference is 10 points (43% on the phone, 33% on the Web). The mode effect appears among all religious groups in the sample other than white evangelicals, where the 7-point gap between the phone and Web is not statistically significant. Underlying attitudes about gays and lesbians do not appear to affect the likelihood that people will answer differently on the phone than on the Web; the mode effect is significant and similar in size for those who think homosexuality should be discouraged by society and those who think it should be accepted.

Telephone survey respondents were also more likely than Web respondents to say Hispanics face “a lot” of discrimination (54% in the phone survey, 42% in the Web survey). Among Hispanics questioned for this study there is also a difference in response by mode: 41% on the Web say Hispanics face a lot of discrimination, while 61% on the phone say this. And there is a 14-point mode difference among white respondents. But among black respondents, there was no significant effect: 66% of blacks interviewed by phone said Hispanics face a lot of discrimination, while 61% of those interviewed on the Web said the same.

Racial Divide on Discrimination Against Blacks in Society

When asked about discrimination against blacks, more phone respondents (54%) than Web respondents (44%) said this group faced a lot of discrimination. This pattern was clear among whites, where 50% on the phone and just 37% on the Web said blacks face a lot of discrimination. But among blacks, the pattern is reversed: 71% of black respondents interviewed by phone say they face “a lot” of discrimination, while on the Web 86% do so.

Unlike the items about other minority groups, there is no significant mode difference in responses to the question about women. Exploration of key demographic subgroups showed similar answers on the Web and on the phone, suggesting that social desirability may not influence responses to this question.

Happiness and Life Satisfaction Higher Among Phone Respondents

Sizeable mode differences were observed on questions measuring satisfaction with family and social life. Among phone survey respondents, 62% said they were “very satisfied” with their family life; among Web respondents, just 44% said this. Asked about their social life, 43% of phone respondents said they were very satisfied, while just 29% of Web respondents said this. These sizeable differences are evident among most social and demographic groups in the survey.

The mode differences on satisfaction with social life are smallest among people who, in a different question, say they are “very happy” with their life and larger among those who say they are “pretty happy” or “not too happy.” However, answers to the happiness question itself do not vary by survey mode.

Another question in the series asked about satisfaction with traffic conditions in the respondent’s area. This item had a 6-point mode difference, with slightly more phone respondents (28%) than Web respondents (22%) saying they were very satisfied.

Respondents were also asked about satisfaction with their local community as a place to live. Phone respondents were again more positive, with 37% rating their community as an excellent place to live, compared with 30% of Web respondents. But there was no significant difference by mode in the percentage who gave a negative rating to their community (“only fair” or “poor”).

In a related item, telephone survey respondents were more likely than Web respondents to describe their neighborhood as a “very safe” place to walk after dark (55% in the phone survey, 43% in the Web survey). But few in either mode characterized their neighborhood as “not too safe” or “not at all safe.”

Volunteering, Community Involvement and Social Trust

Fewer Report Neighborly Interactions on the Web Than the Phone

Because neighborliness and civic engagement are considered virtuous behaviors by many people, it would not be surprising to see more reports of these activities in a phone interview than on the Web. Mode differences were observed on some but not all such measures. Respondents on the phone reported more interaction with neighbors than they did on the Web. Similarly, phone respondents were more likely to say they worked in the community to fix a problem or improve a condition, but were not more likely to report engaging in volunteering through an organization. Where mode differences appear, they tend to be larger among higher income individuals.

Asked how frequently they talk to their neighbors in a typical month, 58% of phone respondents report talking to their neighbors “every day” or “a few times a week”; on the Web, 47% report doing so. The mode difference among higher income respondents (those making more than $75,000 a year) is 15 percentage points.

When asked about volunteering either in general or for youth organizations, the percentage of respondents who say they have volunteered in the past year is not significantly different in the Web and phone groups (58% vs. 61% respectively). However, among white evangelical Protestants, a sizable mode effect is observed. Seven-in-ten evangelicals on the phone (71%) report volunteering in the past year, compared with 57% on the Web — a 14-percentage point difference. By comparison, there is no significant mode effect among white mainline Protestants, Catholics or the unaffiliated.

A modest mode effect is observed on a question asking about working with other people in the neighborhood to fix a problem or improve a condition: 38% on the phone report doing so, compared with 33% on the Web. The mode difference was 10 points among higher income respondents.

One other aspect of social capital is social trust, but the study finds no significant mode difference in responses to a standard question about trusting other people.

The Impact of Mode on Reports of Financial Circumstances

Fewer Report Financial Troubles on the Phone Than the Web

A series of questions on personal financial conditions uncovers further differences in response by mode. Web respondents are more likely than those on the phone to say that in the past year they have had trouble paying for food for their family and that they were not able to see a doctor when they needed to because of cost. The effect is strongest among those with lower incomes.

Similarly, Web respondents also are more likely than phone respondents to say their standard of living is “somewhat” or “much worse” than their parents’ at a similar age (28% on the Web, 20% on the phone). This mode effect is notable among blacks, with 29% of Web respondents saying they are worse off than their parents, compared with only 9% on the phone — a 20-point difference.

More Web respondents (28%) than phone respondents (20%) said they did not have enough money to buy the food their family needed in the past year. Lower income respondents (those making less than $30,000 a year) on the Web are 12 percentage points more likely than those on the phone to say that finding the money for food was an issue (51% on the Web, 39% on the phone).

In a related item, Web survey respondents are somewhat more likely than telephone survey respondents to say that in the past year they needed to see a doctor but were not able to because of cost (22% in the phone survey, 28% in the Web survey). Among non-whites, the mode gap is 17 percentage points (40% on the Web, 23% on the phone). Among whites, there is no difference (22% on the Web, 21% on the phone). This question illustrates how a mode effect could lead to an incorrect conclusion: a phone interview would suggest that whites and non-whites have similar access to a doctor, while the Web interview shows a sizeable difference in access.

The mode effect is particularly evident among those who say (in a separate question) that they have not seen a doctor at all in the past year. Among phone respondents who report that they have not visited a doctor in the past year, 23% say they have not seen a doctor because of cost; among Web respondents, 46% say this. By contrast, no mode effect is apparent among people who said they have been to the doctor at least once in the past year.

Modest Mode Effects on Measures of Religious Affiliation and Importance

Religion Measures Less Affected by Mode of Interview

The U.S. is among the most religious of all the advanced industrial nations. Accordingly, it is possible that social pressure exists for people to report being personally religious. And yet, the study finds that for most questions about religious affiliation, belief and practice, results were similar on the Web and on the phone.

Respondents are equally likely to identify with a major religious tradition on both the Web and the phone; conversely, respondents are equally likely to identify as unaffiliated with any religion on the Web and on the phone.

Another question in the series asked about the importance of religion in one’s life. No significant mode effect is present on this item. Similarly, no mode difference is seen on a question asking if it is necessary to believe in God to be moral and have good values.

The one exception is that Web respondents were somewhat more likely than those interviewed on the telephone to say that they “seldom” or “never” attended religious services (43% on the Web, 36% on the phone). There is no mode difference in the percent reporting that they attend services at least once a week (31% on the Web, 32% on the phone).

Most of the mode effect is concentrated among people who do not affiliate with a religious tradition. Among the unaffiliated, 60% on the Web say they never attend, compared with 49% on the phone. Among people with a religious affiliation, the differences in reported attendance by mode are comparatively small.

The mode difference in reporting low levels of religious attendance is observed among men but not among women. Half of men on the Web (50%) say they seldom or never attend religious services, compared with one-third (36%) on the phone. Among women, similar numbers on the Web (37%) and phone (36%) report attending seldom or never.

Use of Internet and Technology

Some Technology Use Measures Sensitive to Mode of Interview

Questions about internet usage and technology may be particularly sensitive to survey mode if one of the modes is the internet itself. Using the internet to take a survey may bring to mind thoughts and ideas about technology use that might not arise when taking a survey by phone, simply because the context is directly related to the subject matter. It is also possible that people who are more likely to respond to a survey request when it comes via Web than via phone are different with respect to their technology use, and these differences may not be corrected by weighting.

The share of respondents who reported using the internet every day was not significantly different in the Web and phone groups (84% vs. 82% respectively). But among daily internet users, the difference in regularity of use was significant, with 36% of the Web group indicating that they use the internet constantly, compared with 28% of the phone group. An examination of our panelists’ responses to questions about technology use in previous waves suggests that part of this 8-percentage point difference is attributable to differences in the type of people who were more likely to respond to the Web survey, while part is due to the mode of the interview itself.

We compared the results among respondents who indicated in Wave 1 of the panel (March-April 2014) that they use one of several social networks several times a day. 4 Across both frequent and less frequent social media users, the percentage of Web respondents reporting constant internet use is 5 percentage points higher than for phone respondents. Although frequency of social media use is not a perfect proxy for constant internet use, the fact that the mode difference is identical for frequent and less frequent social media users suggests that people with similar internet usage habits answer the question in different ways depending on the mode in which it is asked.

On the other hand, exploring this previously collected data also reveals that 40% of the Web-mode respondents are frequent social media users, compared with only 30% of phone respondents. This means that the overall mode difference in the percentage reporting constant internet usage is likely a function of both the way respondents answer the question and true differences in internet usage between the two groups.
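The arithmetic behind this "both composition and mode" conclusion can be made explicit. Under the simplifying assumption that the 5-point mode gap is constant within both user groups, the overall gap splits cleanly into a mode part and a composition part, as the sketch below shows. The 25-point usage gap between frequent and less frequent social media users is hypothetical, chosen only to show how the pieces can add up to something near the observed 8 points; the 5-point and 40%/30% figures come from the text.

```python
# Figures from the text:
mode_gap = 5.0                        # Web-phone gap within each user group (pct. pts.)
share_web, share_phone = 0.40, 0.30   # share of frequent social media users per group

# Hypothetical: frequent users report "constant" internet use 25 points more
# often than less frequent users, in either mode.
usage_gap = 25.0

composition_part = (share_web - share_phone) * usage_gap
print(f"mode effect: {mode_gap:.1f} pts + composition: {composition_part:.1f} pts "
      f"= {mode_gap + composition_part:.1f} pts")  # 5.0 + 2.5 = 7.5
```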

All of the participants in this experiment were enrolled in the panel for several months prior to the mode study, but they varied considerably in how many previous waves they had responded to. Those who had been regular participants were more apt to respond to this study’s invitation if they were assigned to the Web group than if they were assigned to the phone group (perhaps because that was more familiar to them). Less regular participants were the opposite: They were more likely to respond if assigned to the phone group (perhaps because they are less comfortable on the Web). In short, those who responded on the Web may be more Web savvy than their counterparts in the phone group. 5 Altogether, this serves to demonstrate the difficulties inherent in conducting survey research on topics where the mode of data collection may be related to both survey participation and measurement.

For other technology-related items, the effects are smaller. About half (54%) of Web respondents reported playing a game on a computer, mobile device or video game console the previous day, compared with 48% of phone respondents. Web and phone respondents were statistically equivalent in their reporting of worries about computers and technology being used to invade privacy (26% of Web respondents say they worry “a lot” vs. 22% on the phone), sending an email or a text to a friend or relative the previous day (79% for both) and use of a social networking site the previous day (69% on the Web vs. 66% on the phone).

Mode Effects for Autobiographical and Factual Knowledge

Answering a survey question requires respondents to recall certain kinds of relevant information and to use this information to help formulate an answer. The mode of the interview can affect this process of recall in different ways, either by making it easier or more difficult for respondents to perform the necessary search of memory or by affecting their motivation to conduct a thorough search. For example, an interviewer may be able to encourage respondents to make the effort to recall whether they read a newspaper the previous day. At the same time, respondents on the telephone may feel some pressure to answer quickly, so as not to keep the interviewer waiting. On the Web, respondents are free to take longer to think about the answer to a question. Time to think can be particularly important for questions that may require respondents to recall autobiographical or factual information from memory. 6

Reports of Doctor Visits

One question that might have been vulnerable to the interviewer’s presence asked for an estimate of the number of times in the past 12 months that the respondent had seen a doctor or other health care professional. Although the distribution of the answers was nearly identical on the Web and the phone, a follow-up question found interesting differences in how respondents arrived at their answers. When offered four options for how they came up with their answer, phone respondents were more likely than Web respondents (15% vs. 7%) to say that they estimated the number “based on a general impression.” This difference, though modest in size, could indicate that some phone respondents are more likely to take the easiest possible route to an answer in order to save time. Alternatively, it could reflect a recency effect, in that this option was the last of the four to be read to respondents.
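Whether a 15% vs. 7% split could plausibly arise from sampling noise can be checked with a two-proportion z-test. A minimal sketch, using the completed-interview counts from the methodology section as stand-ins for the item’s actual unweighted bases and ignoring the design effect of weighting:

```python
from math import sqrt
from statistics import NormalDist

# Two-proportion z-test for the "general impression" follow-up:
# 15% of phone respondents vs. 7% of Web respondents. The group sizes
# are the completed-interview counts reported later in this report
# (1,494 and 1,509); a production analysis would use the item's actual
# unweighted base and account for the design effect of weighting.
n_phone, n_web = 1494, 1509
p_phone, p_web = 0.15, 0.07

pooled = (p_phone * n_phone + p_web * n_web) / (n_phone + n_web)
se = sqrt(pooled * (1 - pooled) * (1 / n_phone + 1 / n_web))
z = (p_phone - p_web) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"z = {z:.1f}, two-sided p = {p_value:.2g}")
```

At these sample sizes, a gap of this magnitude is far larger than sampling error alone would produce.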

Levels of Factual Knowledge

Questions of factual knowledge also require respondents to comb their memory for answers. The time afforded for cognitive processing on the Web may improve performance on these kinds of questions. On the other hand, the presence of an interviewer can provide additional motivation to respondents to think carefully about the questions.

Some researchers have expressed concern that Web respondents can cheat on knowledge quizzes by looking up the answers. 7 The amount of time that respondents took to answer the questions on both the Web and the phone was recorded. For the two questions about factual knowledge, Web respondents took 7-8 seconds longer than phone respondents to answer. In comparison, there was no mode difference in the elapsed time for the attitudinal questions that preceded and followed the knowledge questions. Yet if cheating were occurring, one would expect Web respondents to have done better on both questions, and they did not.
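Because response latencies are recorded in both modes, the timing comparison itself is straightforward. A minimal sketch with simulated, right-skewed timing data; the real analysis would use each respondent’s recorded per-item elapsed times:

```python
import numpy as np
from scipy import stats

# Comparing response latencies by mode for a knowledge item. The timing
# vectors here are simulated placeholders shaped to echo the reported
# pattern (Web answers taking several seconds longer than phone answers).
rng = np.random.default_rng(0)
web_secs = rng.lognormal(mean=2.9, sigma=0.5, size=1509)
phone_secs = rng.lognormal(mean=2.5, sigma=0.5, size=1494)

# Welch's t-test on log-times, since latencies are right-skewed
t, p = stats.ttest_ind(np.log(web_secs), np.log(phone_secs),
                       equal_var=False)
print(f"median gap: {np.median(web_secs) - np.median(phone_secs):.1f}s, "
      f"t = {t:.1f}, p = {p:.2g}")
```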

This study has examined the impact of survey mode on a wide range of public opinion questions drawn from those commonly asked by Pew Research Center and related organizations. While many of the differences discovered between modes are modest, some are sizeable. And many of the differences are consistent with the theory that respondents are more likely to give answers that paint themselves or their communities in a positive light, or less likely to portray themselves negatively, when they are interacting with an interviewer. This appears to be the case with the questions presenting the largest differences in the study — satisfaction with family and social life, as well as questions about the ability to pay for food and medical care. The fact that telephone respondents consistently exhibit more socially desirable reporting is consistent with a large body of preexisting literature on the topic. For most of these and other differences described here, there is no way to determine whether the telephone or the Web responses are more accurate, though previous research examining questions for which the true value is known has found that self-administered surveys generally elicit more accurate information than interviewer-administered surveys. 8

Another set of questions that exhibited large differences pertained to attitudes toward political figures. For four of the six figures we asked panelists to rate, the net favorable rating was significantly higher in the phone group than in the Web group. At the same time, panelists in the Web group were significantly more likely to choose “very unfavorable” for all six of the individuals asked about. While attitudes about political figures may not be sensitive in the way that reports about family life or financial trouble are sensitive, some recent research has suggested that when interviewers are present, respondents may choose answers that are less likely to produce an uncomfortable interaction with the interviewer. 9 This dynamic may also be in effect among black respondents on the phone who – compared with those surveyed on the Web – are less likely to tell an interviewer that blacks face a lot of discrimination. In the interest of maintaining rapport with an interviewer, respondents may self-censor or moderate their views in ways that they would not online.

Also notable is the fact that these effects of survey mode are distributed unevenly throughout the population. For example, Web respondents were much more likely to rate political figures very unfavorably when the figure was a member of the opposing party. And while blacks interviewed on the phone were less likely than those interviewed on the Web to acknowledge discrimination against blacks, non-blacks interviewed on the phone were significantly more likely to acknowledge it than their Web counterparts.

We did see evidence that reports of frequent internet use may be inflated in Web surveys relative to phone surveys, as well as indications that heavy internet users are more prevalent in the Web sample. Although responses to other questions about technology use were largely consistent across modes, researchers should be aware of the potential for differences due to both nonresponse and measurement error when studying these kinds of items.

Yet while significant mode effects are seen on a variety of measures, an equal number displayed only small or non-existent mode differences. Many of the items asking about concrete events, characteristics or attributes did not appear affected by the mode of interview. These included questions about passport and driver’s license ownership, race and religious affiliation, as well as most questions about specific activities engaged in “yesterday.”

What then should survey designers do when deciding among modes of data collection? This study suggests that there may be advantages to self-administered data collection via the Web, particularly if the survey seeks to measure socially desirable or sensitive topics. The willingness of respondents to express more negative attitudes about their personal lives or toward political figures could reflect a greater level of candidness, although we have no way of knowing which set of answers is more consistent with actual behavior outside of the survey context.

That being said, this study can only speak to the possible effects of mode choice on measurement error, which is only one of many possible sources of error that can affect survey quality. Great pains were taken to ensure that the experimental groups were equivalent, and the sample comes from a pre-recruited, probability-based Web panel. Even in this carefully controlled scenario, we found that respondents who had ignored all previous survey requests were more likely to respond when they were contacted over the phone.

Even with declining response rates, telephone surveys continue to provide access to survey samples that are broadly representative of the general public. Many members of the general public still lack reliable access to the internet, making coverage a concern in practice. Random Digit Dial (RDD) phone surveys have been found to perform better than many probability-based Web surveys at including financially struggling individuals, those with low levels of education and linguistic minorities. Researchers should carefully consider the tradeoffs between measurement error on the one hand and coverage and nonresponse error on the other. Studies using both Web and telephone components – so-called mixed mode studies – may become more common, and many researchers believe that self-administration via the internet will eventually become the standard method of survey research. Pew Research Center and other scholars are currently developing methods for combining data collected from different modes so that disruption to long-standing trend data is minimized.

The mode study was conducted using the Pew Research Center’s American Trends Panel, a probability-based, nationally representative panel of U.S. adults living in households. Respondents who self-identify as internet users (representing 89% of U.S. adults) participate in the panel via monthly self-administered Web surveys, and those who do not use the internet participate via telephone or mail. The panel is managed by Abt SRBI.

All current members of the American Trends Panel were originally recruited from the 2014 Political Polarization and Typology Survey, a large (n=10,013) national landline and cellphone random digit dial (RDD) survey conducted January 23-March 16, 2014 in English and Spanish. At the end of that survey, respondents were invited to join the panel. The invitation was extended to all respondents who use the internet (from any location) and a random subsample of respondents who do not use the internet. 10

Data in this report are drawn from the July wave of the panel, which was conducted July 7-August 4, 2014 among 3,351 respondents. In this study, 50% of panelists who typically take their panel surveys via the Web were randomly assigned to take the survey via the Web mode, resulting in 1,509 Web-mode completed interviews. The remaining 50% of the Web panelists were assigned to take the survey via a telephone interview (phone mode), resulting in 1,494 experimental phone-mode completed interviews. The remaining 348 interviews were completed by non-internet panelists typically interviewed by mail. These non-experimental, phone-mode respondents are not considered in the analysis of the experiment in this report but were interviewed to calculate separate general population estimates from the data in this wave of the panel.

As outlined above, all Web panelists were included in the mode study experiment. Those with a mailing address on file were mailed a pre-notification letter, customized for their treatment group (Web vs. phone mode). The letter explained that the next monthly panel wave was a special study, and that we were attempting to obtain the highest level of participation possible. As such, respondents would be given an extra $5 for completing the study beyond their usual incentive amount of $5 or $10, depending on their incentive group. All incentives were contingent upon completing the mode study survey. The letter explained to the Web-mode panelists that an email invitation would be arriving in their inbox between July 14 and 15. The phone-mode panelists were told the survey was being conducted via telephone for this month only and that they would hear from an interviewer in the next few days. All Web panelists were also sent a pre-notification email, customized for their treatment group. This email contained the same information as the pre-notification letter sent in the mail.

Next, panelists assigned to the Web-mode treatment were sent a standard invitation email. This was followed by up to four reminder emails for nonrespondents. Panelists assigned to the phone-mode treatment were called up to 10 times. A message was left on the first call if a voicemail or answering machine was reached. No refusal conversion was attempted on soft refusals, so as not to antagonize panelists we hoped to retain for future panel waves. After completion of the survey, respondents were sent the incentive amount referenced in their survey materials via check or Amazon gift card, according to their preference.

The ATP data were weighted in a multi-step process that begins with a base weight incorporating the respondents’ original survey selection probability and the fact that some panelists were subsampled for invitation to the panel. Next, an adjustment was made for the fact that the propensity to join the panel varied across different groups in the sample.

The final step in the weighting uses an iterative technique that matches gender, age, education, race, Hispanic origin and region to parameters from the U.S. Census Bureau’s 2012 American Community Survey. Population density is weighted to match the 2010 U.S. Decennial Census. Telephone service is weighted to estimates of telephone coverage for 2014 that were projected from the July-December 2013 National Health Interview Survey. It also adjusts for party affiliation using an average of the three most recent Pew Research Center general public telephone surveys, and for internet use using as a parameter a measure from the 2014 Survey of Political Polarization.

Note that for the mode study, separate weights were computed for the Web respondents, the experimental phone respondents, all phone respondents (experimental and non-experimental) and the total sample. Neither the Web respondent weight nor the experimental phone respondent weight included the internet usage parameter in the raking, as all respondents in these groups are internet users.

Sampling errors and statistical tests of significance take into account the effect of weighting. The Hispanic sample in the American Trends Panel is predominantly native born and English speaking.
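The iterative technique described above is generally known as raking, or iterative proportional fitting. A minimal sketch of the core loop, with toy data and hypothetical targets standing in for the census-style parameters; the production ATP weighting additionally involves the base weight, the panel-joining propensity adjustment and several more raking dimensions:

```python
import numpy as np
import pandas as pd

# Minimal raking (iterative proportional fitting) sketch: rescale weights
# so that weighted margins match population targets one variable at a
# time, cycling through the variables until the adjustments converge.
def rake(df, targets, max_iter=100, tol=1e-8):
    w = np.ones(len(df))
    for _ in range(max_iter):
        max_shift = 0.0
        for var, shares in targets.items():
            vals = df[var].to_numpy()
            total = w.sum()
            for level, share in shares.items():
                mask = vals == level
                factor = share * total / w[mask].sum()
                w[mask] *= factor
                max_shift = max(max_shift, abs(factor - 1.0))
        if max_shift < tol:
            break
    return w

# Toy sample; the margins are hypothetical stand-ins for census targets.
df = pd.DataFrame({"sex":    ["F", "F", "M", "M", "F", "M"],
                   "region": ["NE", "S", "S", "W", "W", "NE"]})
targets = {"sex":    {"F": 0.52, "M": 0.48},
           "region": {"NE": 0.18, "S": 0.38, "W": 0.44}}
df["weight"] = rake(df, targets)
```

Each pass exactly matches the margin for the variable just adjusted while slightly disturbing the others, which is why the procedure cycles until the adjustment factors settle near 1.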

The following table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey:

[Table not reproduced: unweighted sample sizes and sampling error at the 95% confidence level, by group]

Sample sizes and sampling errors for other subgroups are available upon request.

In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.

The Web component of the July wave had a response rate of 64% (1,509 responses among 2,345 individuals sampled from the panel); the experimental phone component had a response rate of 63% (1,494 responses among 2,366 individuals sampled from the panel); the total phone component (experimental and non) had a response rate of 63% (1,842 responses among 2,927 Web-based and non-Web individuals sampled from the panel). Taking account of the response rate for the 2014 survey on political polarization (10.6%), the cumulative response rate for the July ATP wave is 3.7%.
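These figures chain together multiplicatively. A back-of-the-envelope sketch that reproduces the arithmetic; the implied panel join-and-retention rate is backed out from the published numbers rather than taken from the report:

```python
# Response-rate arithmetic for the July wave, using the counts above.
web_rr = 1509 / 2345      # 64% Web completion
phone_rr = 1494 / 2366    # 63% experimental phone completion
recruit_rr = 0.106        # 2014 Polarization survey response rate
cumulative = 0.037        # published cumulative response rate

# Implied joint rate of agreeing to join and remaining in the panel,
# backed out from the published figures (illustrative only).
panel_stage = cumulative / (recruit_rr * web_rr)
print(f"web {web_rr:.0%}, phone {phone_rr:.0%}, "
      f"implied join/retention = {panel_stage:.0%}")
```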

Assessing the Equivalence of the Web and Phone Groups

This study was primarily designed to evaluate differences in the way people respond to questions in different modes. In order to isolate the effects of the mode of interview itself, it is essential that comparisons between the Web and telephone groups are not confounded by systematic differences in the composition of each group. Although panel members were randomly assigned to each of the groups, if one mode or the other disproportionately represents people with particular characteristics, then differences in the response patterns may be due to nonresponse rather than the measurement itself. Because all of the panel members were recruited from the same large telephone survey, we know a great deal about both the panel members who responded and those who did not. This includes their demographic characteristics as well as their partisan and ideological leanings. In this section, we will take a look at how different subgroups responded in each mode.

The overall completion rates for each mode were nearly identical, with 64% of panelists assigned to the Web group completing the survey, compared with 63% of panelists assigned to take the survey by phone. With several notable exceptions, response was largely consistent within demographic subgroups. Web and phone response did not differ significantly by age, education, marital status, income or census region. Completion varied somewhat by mode with respect to sex, race and whether the respondent lives in an urban, suburban or rural location. Women were 6 percentage points more likely to complete the survey on the Web than on the phone, whereas the completion rate for men did not differ significantly between modes.

Demographic Composition

Non-Hispanic whites and Hispanics appear slightly more likely to respond in the Web group than the phone group. Non-Hispanic blacks showed the most pronounced difference, with a 56% completion rate in the phone group, compared with a 42% completion rate on the Web. Whereas urban panelists were equally likely to respond in either mode, suburban response was 6 percentage points higher on the Web, while response in rural areas was 10 percentage points higher by phone. Because the Web and phone samples are independently weighted to national parameters on all of the variables just described, the minor differences in the composition of the two groups that resulted from differential response propensities are corrected.
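With an invitee-level file recording each panelist’s assigned mode, subgroup and completion status, these subgroup comparisons reduce to grouped means. A sketch with hypothetical column names and toy data:

```python
import pandas as pd

# Completion rates by subgroup and assigned mode, from a hypothetical
# invitee-level file (one row per sampled panelist, completed = 0/1).
invitees = pd.DataFrame({
    "mode":      ["web", "phone", "web", "phone", "web", "phone"],
    "race_eth":  ["black", "black", "white", "white", "hisp", "hisp"],
    "completed": [0, 1, 1, 1, 1, 0],
})
rates = (invitees.groupby(["race_eth", "mode"])["completed"]
                 .mean().unstack("mode"))
rates["web_minus_phone"] = rates["web"] - rates["phone"]
print(rates)
```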

Political and Ideological Composition

Because much of the Pew Research Center’s work relates to politics and public policy, the effects of survey design decisions on the partisan and ideological makeup of our samples are of particular importance. We do see some evidence of a mode preference along ideological lines. Panelists who identified themselves as very conservative in the recruitment survey were 11 percentage points more likely to respond if they were in the phone group. On the other end of the scale, panelists who identified as very liberal were 4 percentage points more likely to respond if they were in the Web group. The pattern is similar but not identical for partisanship. 11 Here, Republicans and independents who lean Republican are only 2 percentage points more likely to respond in the phone group, while Democrats and those who lean Democratic are 2 percentage points more likely to respond by Web. The largest effect is found among independents who do not lean toward either party, who are 7 percentage points more likely to complete the survey in the Web group.

Despite these differences within groups in the likelihood of response, the overall demographic and partisan distributions among respondents in each mode group are remarkably similar. Prior to weighting, women make up 51% of the Web group and 49% of the phone group. Although non-Hispanic blacks were significantly more likely to complete the survey by phone, the proportion in the phone group is only 3 percentage points higher than in the Web group (9% and 6% respectively). The percentage of respondents with only a high school education or less is 4 points higher in the phone group than in the Web group. The phone group is slightly more rural, more conservative (37% very or somewhat conservative on the phone vs. 32% on the Web) and has a higher proportion of Republicans and Republican leaners than the Web group. After the groups are weighted to account for nonresponse, these differences are all largely reduced.

One sizeable difference in response that is not completely adjusted for in weighting is the regularity with which panelists responded to prior surveys. The completion rate for panelists who had responded to all three prior waves was 97% for the Web group, compared with 83% for the phone group.  In the Web group, the completion rate for panelists who had missed one or more waves was 32%, compared with 44% for the phone group. This is consistent with the notion that despite all of these panelists having access to the internet, some people are easier to reach and recruit by phone than Web. After weighting, 29% of the Web group had missed one or more prior waves, compared with 43% in the phone group.

Despite this difference, the demographic distributions of the two experimental groups remain quite similar. Moreover, we repeated several of our analyses while controlling for the effects of response to previous waves, and our findings with and without these controls were very similar. The sole exception involved questions on internet use and technology. The Web-mode respondents were more likely to report using the internet “constantly” than the phone respondents, possibly because people who are more frequent internet users are also more likely to respond to a Web-based survey invitation. The telephone sample brought in respondents who are less frequent internet users and therefore less likely to respond to a Web-based survey invitation. In more technical terms, there appears to be a strong, direct association between the response mechanism and outcomes pertaining to frequency of internet use.
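One way to implement such a check is a regression of the outcome on mode with prior-wave participation as a control. A sketch with simulated respondent-level data; in the real analysis the outcome would be the “constant internet use” indicator and the control would flag panelists who had missed earlier waves:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Re-estimating a mode difference while controlling for prior-wave
# participation. All data below are simulated for illustration.
rng = np.random.default_rng(1)
n = 3000
df = pd.DataFrame({
    "web_mode": rng.integers(0, 2, n),      # 1 = assigned to Web
    "missed_prior": rng.integers(0, 2, n),  # 1 = missed >= 1 prior wave
})
logit_p = -1.0 + 0.3 * df.web_mode - 0.4 * df.missed_prior
df["constant_use"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("constant_use ~ web_mode + missed_prior",
                data=df).fit(disp=0)
print(fit.params)  # web_mode coefficient = mode effect net of the control
```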

For Further Reading

Chang, L. and J. A. Krosnick. 2009. “National Surveys via RDD Telephone Interviewing Versus the Internet.” Public Opinion Quarterly, 641–678.

Groves, Robert M. and Robert L. Kahn. 1979. “Surveys by Telephone: A National Comparison with Personal Interviews.” Academic Press.

Kreuter, Frauke, Stanley Presser and Roger Tourangeau. 2008. “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity.” Public Opinion Quarterly.

Kolenikov, Stanislav and Courtney Kennedy. 2014. “Evaluating Three Approaches to Statistically Adjust for Mode Effects.” Journal of Survey Statistics and Methodology.

Presser, Stanley and Linda Stinson. 1998. “Data Collection Mode and Social Desirability Bias in Self-Reported Religious Attendance.” American Sociological Review.

Tourangeau, Roger, Frederick G. Conrad and Mick P. Couper. 2013. “The Science of Web Surveys.” Oxford University Press.

Tourangeau, Roger and Tom W. Smith. 1996. “Asking Sensitive Questions: The Impact of Data-Collection Mode, Question Format, and Question Context.” Public Opinion Quarterly.

  • For example, in a survey of university alumni, individuals were much more likely to rate a question as sensitive if their answer would place them in a socially undesirable category. See Kreuter, Frauke, Stanley Presser and Roger Tourangeau. 2008. “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity.” Public Opinion Quarterly. http://poq.oxfordjournals.org/content/72/5/847.abstract. ↩
  • For an overview of the effects of interviewer presence on answers to sensitive questions, see Tourangeau, Roger and Ting Yan. 2007. “Sensitive Questions in Surveys.” Psychological Bulletin. ↩
  • Krosnick, Jon A. and Duane F. Alwin. 1987. “An Evaluation of a Cognitive Theory of Response-Order Effects in Survey Measurement.” Public Opinion Quarterly. http://poq.oxfordjournals.org/content/51/2/201.abstract . ↩
  • Wave 1 of the American Trends Panel was conducted from March 19 to April 29, 2014. Respondents were asked how often they use Facebook, Twitter, Google Plus, YouTube and LinkedIn. ↩
  • See the Methodological Appendix for a discussion of patterns in the completion rates for different subgroups. ↩
  • See Tourangeau, Roger, Lance J. Rips and Kenneth Rasinski. 2000. “The Psychology of Survey Response.” Cambridge University Press, chapter 3, for an explanation of how memory and recall function as part of the survey response process. ↩
  • Prior, Markus, and Arthur Lupia. 2008. “Money, Time, and Political Knowledge: Distinguishing Quick Recall and Political Learning Skills.” American Journal of Political Science, pages 169-183. Prior and Lupia describe this process and suggest that when respondents have more time, the survey is not only measuring stored knowledge but also the ability to search for and obtain relevant information. ↩
  • For example, Kreuter, Presser and Tourangeau were able to compare respondent self-reports to administrative records on a number of sensitive items pertaining to academic performance, such as failing classes or being placed on academic probation. Respondents who belonged to the sensitive category (e.g., having failed a class) were more likely to falsely deny it when surveyed by an interviewer over the phone than if the survey was self-administered via the Web. See Kreuter, Frauke, Stanley Presser and Roger Tourangeau. 2008. “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity.” Public Opinion Quarterly. http://poq.oxfordjournals.org/content/72/5/847.abstract. ↩
  • Ye, Cong, Jenna Fulton and Roger Tourangeau. 2011. “More Positive or More Extreme? A Meta-Analysis of Mode Differences in Response Choice.” Public Opinion Quarterly. http://poq.oxfordjournals.org/content/75/2/349.short. ↩
  • When data collection for Pew Research Center’s 2014 Political Polarization and Typology Survey began, non-internet users were subsampled at a rate of 25%, but a decision was made shortly thereafter to invite all non-internet users to join. In total, 83% of non-internet users were invited to join the panel. ↩
  • This section refers to party identification as it was measured in Pew Research Center’s 2014 Political Polarization and Typology Survey from which the American Trends Panel was recruited. Party identification was asked again as part of the mode study, but that data is not available for nonrespondents to the mode study. ↩


Telephone Screening Scripts

When a protocol involves contacting potential subjects via telephone about a study, a script is generally required to ensure consistency and completeness in the information that potential subjects are given about the study. The script below is intended for situations in which the study team calls the potential participant, but it can be adapted to situations in which the potential participant makes the call, e.g. in response to a flyer.

Basic Introductory Telephone Script

Basic telephone scripts to introduce a study should include the following:

  • An introduction that gives the name and affiliation of the person calling, e.g. John Smith from the University of Maryland, Baltimore County.
  • The reason for the call: to invite the person to take part in a research study being conducted by [PI’s name]. The study team member should state why s/he is calling that particular individual about the study, e.g. “You’re being invited to participate because…”.
  • A brief description of the purpose of the study and what the potential participant would need to do if s/he decides to participate. This must also include a statement that participation is voluntary.
  • An opportunity for the potential participant to ask questions. Simply asking “Do you have any questions about the study?” is sufficient.
  • After all questions have been answered, the study team member may ask if the potential participant is interested in proceeding to the next step in recruitment for the study (e.g. scheduling a visit to learn more and go through the consent process, or answering some screening questions).
  • A closing section: For those not interested in the study, the closing should include a thank you for the person’s time. For those who say they are interested, the closing section should include instructions for the next step, e.g. setting an appointment, returning a signed consent form, etc.
  • Language that will be used if the potential participant does not answer the telephone, covering a) the message the research team plans to leave on an answering machine or voicemail; and b) what to say if someone other than the potential subject answers the telephone. An example message could be “Hello, my name is X and I am a(n) _______________ (e.g. sociologist) from the University of Maryland, Baltimore County calling to talk to X about a research study.“

Choose the template below to submit with your IRB protocol application:

Telephone screening script



How do I plan the structure of a short video?

A short video typically includes interviews with researchers (and others) and b-roll footage (supplemental or alternative footage intercut with the main shot).

If you are shooting informal footage with your cell phone in the field, always use the phone in landscape format by tilting it onto its side.


  • It helps to conceptualise a video structure before jumping into more formal interview shoots, to avoid a laborious video editing process later.
  • All shoots at specific venues need to be planned carefully to make sure the weather allows filming and entrance is possible. If the videographer is hired or the researcher is flown in, a lack of planning may be costly.
  • To avoid a series of talking heads, the video interview footage is enriched with b-roll footage, e.g. footage of a Protea flower swaying in the wind while the researcher speaks about climate change affecting Cape fynbos. B-roll footage shot in the field in landscape mode on your phone is extremely valuable.
  • Short video productions need to be very tight (2-5 minutes) to retain the attention of a popular audience. They can be super-short 30-second clips too, depending on the platform. Longer versions may entail creating programmes on topics, which is a specialist field for broadcast journalists and documentary filmmakers.

Example of planned steps:

1) Conceptualise the video structure (the order of story elements), as well as who to interview on which subject, the estimated venue, transport and other logistics, and the b-roll photographic footage and infographics needed. Such a structure could look like this:

  • Enter with b-roll footage taken on Robben Island while an introductory voice or text speaks to a challenge (short text, done by a voice-over artist, by the researcher during the interview or by a community member affected by the issue).
  • The frame then shifts to the face of the researcher, who talks about the HSRC work done.
  • The researcher explains the research findings ― frames alternate between the researcher’s face, b-roll footage and infographics.
  • The frame shifts back to the researcher’s face as they talk about what these findings mean and why they are important.
  • The frame then shifts back to the community member’s face as they speak about impact/collaboration ― more b-roll can be included here.
  • The conclusion addresses the What next? question: the researcher’s voice and face, then a roll-out with some b-roll footage and a final HSRC/DSI branding frame.

2) Book videographer, interviewees and venues

3) If you are directing the video, be present at the shoot to guide the reshooting of sections where serious errors are made. If you are being interviewed, ask the interviewer for the planned structure from step 1 so you can prepare some answers.

4) The videographer will clean up the recording and insert the b-roll footage in the appropriate places. They will also help with infographics and the correct branding elements.

5) At this stage, the video clip is likely to be too long. The footage is then cut and edited for length and flow. If the story structure is badly planned or unplanned, this editing may be time-consuming, requiring that sound bites are moved around and costly voice-overs arranged to fill in crucial missing bits. In some cases, the videographer needs to shoot more b-roll footage to cover visual errors.

Impact Centre colleagues are available to support your efforts to produce science communication videos. Antonio is the HSRC’s videographer and is available to assist with the technical aspects, Antoinette/Andrea/Kim are available to provide support on structure, scripts, directing and platforms.

The HSRC’s YouTube channel is available to host your video.


This toolkit is designed to help HSRC researchers to communicate information about their research effectively to maximise impact.





COMMENTS
