
Historically, one of the main reasons for the establishment of the Department of Methodology was the identification by the Economic and Social Research Council of a 'methods gap' in the training of research students. Among the stipulated requirements was a set of research skills for each discipline. Given the considerable overlap in these research skills across departments, the Department of Methodology was established to meet some of these needs efficiently.

A key function of the Department of Methodology is to provide training for PhD and MSc students in the design of social research and in qualitative and quantitative analysis. Methodology courses are open to all LSE students taking MSc or PhD programmes: research students are welcome to attend any of the Department's courses, regardless of which department they are based in, although they should seek approval from their research tutors or supervisors.

Watch our introductory Methodology course videos.

Methodology courses for MSc students

Methodology courses for PhD students

Methodology Workshops

We're offering workshops under MY530 and MY560 on a variety of specific and advanced topics in qualitative and quantitative research methods.

These workshops are primarily aimed at PhD students but may be taken by Masters students. The workshops are non-examinable, self-contained and may be attended independently. Workshops usually take place in Winter and Spring Terms, with the schedules and course outlines available from the MY530 and MY560 Moodle pages and in the relevant workshop tabs below.

Explore our Methods Short Courses collection and sign up via Eventbrite, which opens for bookings two weeks prior to the corresponding workshop.

As part of the Doctoral Training Partnerships (DTPs) network, the Department reserves a small number of places on MY530 and MY560 workshops for non-LSE external visitors. Interested parties can also sign up via Eventbrite under external (non-LSE) ticket admissions.

Please be sure to check out the Frequently Asked Questions section below. For queries about workshops, please email [email protected].

MY530 Advanced Qualitative Research 2023/24

Spring Term.

Week 7 (13 June 2024) – Fieldwork Preparation and Logistics: Budgets, Ethics, Participant Recruitment, by Dr Florian Kern

Sign up here:  https://www.eventbrite.co.uk/e/fieldwork-preparation-logistics-budgets-ethics-participant-recruitment-tickets-915358861997?aff=oddtdtcreator

MY560 Advanced Quantitative Research 2023/24

Spring Term.

Week 7 (11 June 2024) – How to analyse data from population registers and other administrative sources, by Dr Ben Wilson

Sign up here:  https://www.eventbrite.co.uk/e/how-to-analyse-data-from-population-registers-and-administrative-sources-tickets-914688065627?aff=oddtdtcreator

Frequently Asked Questions 

I cannot find the Methods Workshop schedule, information or sign-up link.

Any news relating to the Methods Workshops will be posted on this page. If you cannot find the course or schedule you are looking for, please check back later. Please note that sign-up links only become available two weeks prior to the event. Thank you for your patience.

I am an LSE student; where can I find the relevant details for the workshops?

Please visit our Moodle pages for MY530 and MY560.

I am not an LSE student; can I still sign up for Methods Workshops?

Yes, our workshops are open to the public, but spaces are limited. Please sign up for our workshops via our Eventbrite page, and be sure to follow the page so you don't miss our public events.

When do Methods Workshops run?

You can find a list of our upcoming workshops on this page or on our Eventbrite page. Our workshops run during the Winter Term and Spring Term; LSE term dates are available here.

Where are these workshops held?

Workshops are held in person at CON 1.01, Department of Methodology, Connaught House, 65 Aldwych, London WC2B 4DS, unless stated otherwise on the event page or Moodle.

I can no longer attend the workshop I signed up for.

Please cancel your ticket on Eventbrite so that a waitlisted participant can take your place.

Will I get a certificate for attending these workshops?

Attendance certificates are available upon request; please contact our team at [email protected].

The workshop I signed up for has been cancelled.

We apologise for the inconvenience. Should any workshop be cancelled, a member of staff will contact you by email and let you know whether the workshop has been rescheduled.



ME305 Qualitative Research Methods

Course info.

Dr Aliya Rao  &  Dr Chana Teeger

The purpose of this course is to equip participants with the skills to be able to sensitively and critically design, carry out, report, read, and evaluate qualitative research projects, focussing on in-depth interviews and participant observation.

It is taught by qualitative research experts who have experience of using the methods they teach. It covers the full cycle of a field-based qualitative research project, from design to data collection, analysis, reporting and dissemination.

The course has the dual aims of equipping students with conceptual understandings of current academic debates regarding methods, and the practical skills to put those methods into practice.


Research Methods, Data Science and Mathematics

Build the insights, tools and techniques to support your projects and decisions, and fast-track your career.


The ability to analyse any problem or topic, to bring evidence to the table in support of your decisions, to back your arguments with facts – the skill of research is a business-critical competence that is not only in increasingly high demand today; it is also a skill that will future-proof your career tomorrow, whatever you decide to do and whichever industry or sector you decide to pursue. Research Methods, Data Science and Mathematics at LSE Summer School offer you a slew of state-of-the-art courses. From statistics and machine learning to real analysis and qualitative research, our courses will give you a rock-solid grounding in the techniques, the tools and the methods you will need to ensure success.

Led by the LSE departments of Statistics, Mathematics and Methodology – world-leaders in research and teaching – our LSE Summer School courses in Research give you privileged access to the latest thinking and to some of the foremost faculty working in their field today. You will explore the core ideas and concepts and translate theory into practice through dynamic and hands-on learning experiences and experiments. And you will develop both actionable understanding and the real-world tools and methodologies to bring real impact to your work, to your organisation and your career – whatever your role, your sector or your future aspiration.

Research Methods, Data Science and Mathematics courses at LSE Summer School have been expertly designed to be equally impactful for students with some previous experience and understanding of these fields, as well as curious and motivated newcomers. 


Ranked #1 for percentage of world-leading research

ME205: Strategic Decision Making: An Introduction to Operations Research Methods

In this course, you will be introduced to a diverse set of Operations Research methods and applications.


The fundamentals of my course are covered at my home institution, but the summer school course gives me an extra breadth into how the industry works. It’s been a really good experience in diversifying my skill set.

Jonathan Tam, Canada, University of Leeds/Transport for London

Find information about our Faculty and their research

Professor Kenneth Benoit

Summer School Programme Director, Research Methods, Data Science and Mathematics - Director of the Data Science Institute - Professor of Computational Social Science

Dr James Abdey

Associate Professor (Education)

Dr Ahmad Abdi

Associate Professor

Dr Jack Blumenau

Guest Lecturer

Dr Ali Boyle

Assistant Professor in Philosophy

Dr Jonathan Cardoso-Silva

Assistant Professor (Education)

Dr Christoph Czichowsky

Dr Thomas Ferretti

Dr Kostas Kalogeropoulos

Dr Milt Mavrakakis

Dr Gelly Mitrodima

Dr Katerina Papadaki

Dr Eleanor Power

Dr Aliya Rao

Assistant Professor

Dr Paola Romero

Guest Teacher

Professor Johannes Ruf

Professor of Mathematics

Dr Chana Teeger

Dr Milena Tsvetkova

Assistant Professor of Computational Social Science

Professor Luitgard Veraart

Professor Alex Voorhoeve

Dr Kate Vredenburgh

You may also be interested in


Applications for Summer School 2024 are now open


Get answers to any questions you may have about LSE Summer School


The learning experience

Find out more about how and what you’ll learn at Summer School



Adam Jowett

April 20th, 2020

Carrying out qualitative research under lockdown – practical and ethical considerations


How can qualitative researchers collect data during social-distancing measures? Adam Jowett outlines several techniques researchers can use to collect data without face-to-face contact with participants. Bringing together a number of previous studies, he also suggests that such techniques have their own methodological advantages and disadvantages, and that while they may appear particularly apt during the coronavirus crisis, researchers should take time to reflect on ethical issues before re-designing their studies.

The Covid-19 crisis is affecting the way that we work and we’re all learning how to work more remotely. It may also affect the way we go about conducting research. Many researchers are having to suspend data collection or re-design their projects taking into account social-distancing measures.  

Much qualitative research typically relies on face-to-face interaction for data collection through interviews, focus groups and fieldwork. But there are myriad ways researchers and students can collect qualitative data online or gather textual data that already exists. A book by social scientists Virginia Braun, Victoria Clarke and Debra Gray titled Collecting Qualitative Data: A practical guide to textual, media and virtual techniques provides useful guidance on what these various techniques have to offer, what kinds of research questions they are most suitable for answering, as well as specific ethical issues that require consideration. Other useful resources that have been crowdsourced include the LSE Digital Ethnography Collective Reading List and Deborah Lupton’s Doing Fieldwork in a Pandemic. These techniques can be broadly divided into data generation techniques (where the researcher generates data) and data sampling techniques (where the researcher collects texts that already exist).

For data generation, perhaps the most obvious option is the use of video-calling (e.g. Skype/Zoom) or text-based instant messaging (e.g. WhatsApp) to virtually replicate the face-to-face interview or focus group. Notwithstanding problems such as participants not being able to use the technology or having a poor Wi-Fi connection, video-calling is a close substitute for in-person interviewing and can allow data to be collected over large geographical areas even when social distancing measures are not in place. In addition to video-calling, online surveys can also be used to collect qualitative data by asking respondents to type their responses to open-ended questions. Although qualitative surveys generate less rich data than interviews, they do maintain some of the benefits of qualitative research (e.g. the generation of unanticipated findings) and allow for data collection from a larger number of people, relatively quickly.

In terms of data sampling techniques, there is a wealth of potential data sources available. For example, print media (e.g. news and magazine articles ) can easily be used to analyse social representations of a wide range of topics. Broadcast media (e.g. television or radio discussion programmes) can imitate focus group discussions on topics, meanwhile published autobiographies or blogs can provide first-person narratives for examining a wide range of human experience. Social scientists have also conducted qualitative analyses of textbooks , websites , political speeches and debates , patient information literature and so on. Online discussion forums and social media have also been used to examine a wide range of social phenomena. There may even be open-access qualitative data archives of research interviews and focus groups that you could use for your own purposes. Such sources of data can be useful whether you’re amid a global pandemic or not. For instance, they are easily accessible and the researcher is arguably examining the ‘real’ social world rather than artificially generating data specifically for the purposes of research. For student research, it can also be less ethically risky than inexperienced researchers interviewing people on potentially sensitive topics.    

qualitative research methods lse

While all of these are viable options, they come with their own methodological advantages and disadvantages. For example, some may question whether media sources that were produced for specific purposes and have undergone unknown editorial processes are valid sources of data for research. This depends on what research question you are trying to answer. While the validity of novel data sources is an important issue to consider, the validity of interviews and focus groups has also been questioned, not least due to social desirability biases and researchers directing the lines of questioning. Interviewing people online during a global pandemic (which is variously affecting everyone’s state of mind and behaviour) may also have implications for the validity of the research. Some might decide they wish to study the coronavirus (Covid-19) pandemic itself. While such studies might provide a useful snapshot of life during the crisis, the benefit of hindsight might provide a fuller picture. There are also complex ethical issues to consider when thinking of conducting research during a global pandemic, and each of the methodological techniques suggested above comes with unique ethical considerations.

The first thing to bear in mind is that the health and wellbeing of participants and researchers should take priority over research timelines and thesis/dissertation deadlines. So, while it may be possible to change your interviews from face-to-face to online interviews, researchers should consider whether asking people to participate in research at this time will put them under any additional unnecessary stress. For example, attempting to conduct online interviews with health professionals would most likely be inappropriate in the current context. Furthermore, if you are thinking of modifying your method of data collection, you should ensure you inform your ethics committee beforehand.  

While analysing media content, policy documents and other official public content is relatively straightforward ethically speaking, content generated online by the public (e.g. forums, blogs, vlogs, reader comments) can be more ethically controversial. The key consideration is what constitutes ‘public’ or ‘private’ online and how such research might be received by those individuals or communities whose content has been used. Researchers should also check if their professional bodies have any specific guidance regarding online data collection. For example, the British Psychological Society has ethical guidelines for internet-mediated research. Researchers should also not assume that they do not require ethical approval to conduct such research.

This pandemic is leading us all to reflect on how we do things. Researchers should take time to pause and reflect on whether data collection can be postponed. For example, if you’re doing a PhD you could focus on desk-based aspects of the research (e.g. literature reviewing, writing up a section of the thesis). However, some students (e.g. Masters students) may not have this luxury and in the mid to longer term we may want to consider how we can conduct qualitative research at a physical distance. Many things about how we work may change as a result of this crisis and how we conduct qualitative research may well be one of them.  

Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our  comments policy  if you have any concerns on posting a comment below.

Featured image credit adapted from: Fathromi Ramdlon  via Pixabay.


About the author


Adam Jowett is Associate Head of the School of Psychological, Social & Behavioural Sciences at Coventry University. Adam was formerly Secretary of the British Psychological Society’s Qualitative Methods in Psychology Section and has used textual, media and online methods in much of his own qualitative research. 



How to use and assess qualitative research methods

Loraine Busetto

1 Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany

Wolfgang Wick

2 Clinical Cooperation Unit Neuro-Oncology, German Cancer Research Center, Heidelberg, Germany

Christoph Gumbinger

Associated data.

Not applicable.

This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions, and focussing on intervention improvement. The most common methods of data collection are document study, (non-) participant observations, semi-structured interviews and focus groups. For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research conducted. Using qualitative in addition to quantitative designs will equip us with better tools to address a greater range of research problems, and to fill in blind spots in current neurological research and practice.

The aim of this paper is to provide an overview of qualitative research methods, including hands-on information on how they can be used, reported and assessed. This article is intended for beginning qualitative researchers in the health sciences as well as experienced quantitative researchers who wish to broaden their understanding of qualitative research.

What is qualitative research?

Qualitative research is defined as “the study of the nature of phenomena”, including “their quality, different manifestations, the context in which they appear or the perspectives from which they can be perceived” , but excluding “their range, frequency and place in an objectively determined chain of cause and effect” [ 1 ]. This formal definition can be complemented with a more pragmatic rule of thumb: qualitative research generally includes data in form of words rather than numbers [ 2 ].

Why conduct qualitative research?

Because some research questions cannot be answered using (only) quantitative methods. For example, one Australian study addressed the issue of why patients from Aboriginal communities often present late or not at all to specialist services offered by tertiary care hospitals. Using qualitative interviews with patients and staff, it found one of the most significant access barriers to be transportation problems, including some towns and communities simply not having a bus service to the hospital [ 3 ]. A quantitative study could have measured the number of patients over time or even looked at possible explanatory factors – but only those previously known or suspected to be of relevance. To discover reasons for observed patterns, especially the invisible or surprising ones, qualitative designs are needed.

While qualitative research is common in other fields, it is still relatively underrepresented in health services research. The latter field is more traditionally rooted in the evidence-based-medicine paradigm, as seen in " research that involves testing the effectiveness of various strategies to achieve changes in clinical practice, preferably applying randomised controlled trial study designs (...) " [ 4 ]. This focus on quantitative research and specifically randomised controlled trials (RCT) is visible in the idea of a hierarchy of research evidence which assumes that some research designs are objectively better than others, and that choosing a "lesser" design is only acceptable when the better ones are not practically or ethically feasible [ 5 , 6 ]. Others, however, argue that an objective hierarchy does not exist, and that, instead, the research design and methods should be chosen to fit the specific research question at hand – "questions before methods" [ 2 , 7 – 9 ]. This means that even when an RCT is possible, some research problems require a different design that is better suited to addressing them. Arguing in JAMA, Berwick uses the example of rapid response teams in hospitals, which he describes as " a complex, multicomponent intervention – essentially a process of social change" susceptible to a range of different context factors including leadership or organisation history. According to him, "[in] such complex terrain, the RCT is an impoverished way to learn. Critics who use it as a truth standard in this context are incorrect" [ 8 ] . Instead of limiting oneself to RCTs, Berwick recommends embracing a wider range of methods , including qualitative ones, which for "these specific applications, (...) are not compromises in learning how to improve; they are superior" [ 8 ].

Research problems that can be approached particularly well using qualitative methods include assessing complex multi-component interventions or systems (of change), addressing questions beyond “what works”, towards “what works for whom when, how and why”, and focussing on intervention improvement rather than accreditation [ 7 , 9 – 12 ]. Using qualitative methods can also help shed light on the “softer” side of medical treatment. For example, while quantitative trials can measure the costs and benefits of neuro-oncological treatment in terms of survival rates or adverse effects, qualitative research can help provide a better understanding of patient or caregiver stress, visibility of illness or out-of-pocket expenses.

How to conduct qualitative research?

Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [ 13 , 14 ]. As Fossey puts it : “sampling, data collection, analysis and interpretation are related to each other in a cyclical (iterative) manner, rather than following one after another in a stepwise approach” [ 15 ]. The researcher can make educated decisions with regard to the choice of method, how they are implemented, and to which and how many units they are applied [ 13 ]. As shown in Fig.  1 , this can involve several back-and-forth steps between data collection and analysis where new insights and experiences can lead to adaption and expansion of the original plan. Some insights may also necessitate a revision of the research question and/or the research design as a whole. The process ends when saturation is achieved, i.e. when no relevant new information can be found (see also below: sampling and saturation). For reasons of transparency, it is essential for all decisions as well as the underlying reasoning to be well-documented.

[Figure 1: Iterative research process]

While it is not always explicitly addressed, qualitative methods reflect a different underlying research paradigm than quantitative research (e.g. constructivism or interpretivism as opposed to positivism). The choice of methods can be based on the respective underlying substantive theory or theoretical framework used by the researcher [ 2 ].

Data collection

The methods of qualitative data collection most commonly used in health research are document study, observations, semi-structured interviews and focus groups [ 1 , 14 , 16 , 17 ].

Document study

Document study (also called document analysis) refers to the review by the researcher of written materials [ 14 ]. These can include personal and non-personal documents such as archives, annual reports, guidelines, policy documents, diaries or letters.

Observations

Observations are particularly useful to gain insights into a certain setting and actual behaviour – as opposed to reported behaviour or opinions [ 13 ]. Qualitative observations can be either participant or non-participant in nature. In participant observations, the observer is part of the observed setting, for example a nurse working in an intensive care unit [ 18 ]. In non-participant observations, the observer is “on the outside looking in”, i.e. present in but not part of the situation, trying not to influence the setting by their presence. Observations can be planned (e.g. for 3 h during the day or night shift) or ad hoc (e.g. as soon as a stroke patient arrives at the emergency room). During the observation, the observer takes notes on everything or certain pre-determined parts of what is happening around them, for example focusing on physician-patient interactions or communication between different professional groups. Written notes can be taken during or after the observations, depending on feasibility (which is usually lower during participant observations) and acceptability (e.g. when the observer is perceived to be judging the observed). Afterwards, these field notes are transcribed into observation protocols. If more than one observer was involved, field notes are taken independently, but notes can be consolidated into one protocol after discussions. Advantages of conducting observations include minimising the distance between the researcher and the researched, the potential discovery of topics that the researcher did not realise were relevant and gaining deeper insights into the real-world dimensions of the research problem at hand [ 18 ].

Semi-structured interviews

Hijmans & Kuyper describe qualitative interviews as “an exchange with an informal character, a conversation with a goal” [ 19 ]. Interviews are used to gain insights into a person’s subjective experiences, opinions and motivations – as opposed to facts or behaviours [ 13 ]. Interviews can be distinguished by the degree to which they are structured (i.e. a questionnaire), open (e.g. free conversation or autobiographical interviews) or semi-structured [ 2 , 13 ]. Semi-structured interviews are characterized by open-ended questions and the use of an interview guide (or topic guide/list) in which the broad areas of interest, sometimes including sub-questions, are defined [ 19 ]. The pre-defined topics in the interview guide can be derived from the literature, previous research or a preliminary method of data collection, e.g. document study or observations. The topic list is usually adapted and improved at the start of the data collection process as the interviewer learns more about the field [ 20 ]. Across interviews the focus on the different (blocks of) questions may differ and some questions may be skipped altogether (e.g. if the interviewee is not able or willing to answer the questions or for concerns about the total length of the interview) [ 20 ]. Qualitative interviews are usually not conducted in written format as this impedes the interactive component of the method [ 20 ]. In comparison to written surveys, qualitative interviews have the advantage of being interactive and allowing for unexpected topics to emerge and to be taken up by the researcher. This can also help overcome a provider- or researcher-centred bias often found in written surveys, which, by nature, can only measure what is already known or expected to be of relevance to the researcher. Interviews can be audio- or video-taped, but sometimes it is only feasible or acceptable for the interviewer to take written notes [ 14 , 16 , 20 ].

Focus groups

Focus groups are group interviews to explore participants’ expertise and experiences, including explorations of how and why people behave in certain ways [ 1 ]. Focus groups usually consist of 6–8 people and are led by an experienced moderator following a topic guide or “script” [ 21 ]. They can involve an observer who takes note of the non-verbal aspects of the situation, possibly using an observation guide [ 21 ]. Depending on researchers’ and participants’ preferences, the discussions can be audio- or video-taped and transcribed afterwards [ 21 ]. Focus groups are useful for bringing together homogeneous (to a lesser extent heterogeneous) groups of participants with relevant expertise and experience on a given topic on which they can share detailed information [ 21 ]. Focus groups are a relatively easy, fast and inexpensive method to gain access to information on interactions in a given group, i.e. “the sharing and comparing” among participants [ 21 ]. Disadvantages include less control over the process and a lesser extent to which each individual may participate. Moreover, focus group moderators need experience, as do those tasked with the analysis of the resulting data. Focus groups can be less appropriate for discussing sensitive topics that participants might be reluctant to disclose in a group setting [ 13 ]. Moreover, attention must be paid to the emergence of “groupthink” as well as possible power dynamics within the group, e.g. when patients are awed or intimidated by health professionals.

Choosing the “right” method

As explained above, the school of thought underlying qualitative research assumes no objective hierarchy of evidence and methods. This means that each choice of single or combined methods has to be based on the research question that needs to be answered and a critical assessment with regard to whether or to what extent the chosen method can accomplish this – i.e. the “fit” between question and method [ 14 ]. It is necessary for these decisions to be documented when they are being made, and to be critically discussed when reporting methods and results.

Let us assume that our research aim is to examine the (clinical) processes around acute endovascular treatment (EVT), from the patient’s arrival at the emergency room to recanalization, with the aim to identify possible causes for delay and/or other causes for sub-optimal treatment outcome. As a first step, we could conduct a document study of the relevant standard operating procedures (SOPs) for this phase of care – are they up-to-date and in line with current guidelines? Do they contain any mistakes, irregularities or uncertainties that could cause delays or other problems? Regardless of the answers to these questions, the results have to be interpreted based on what they are: a written outline of what care processes in this hospital should look like. If we want to know what they actually look like in practice, we can conduct observations of the processes described in the SOPs. These results can (and should) be analysed in themselves, but also in comparison to the results of the document analysis, especially as regards relevant discrepancies. Do the SOPs outline specific tests for which no equipment can be observed or tasks to be performed by specialized nurses who are not present during the observation? It might also be possible that the written SOP is outdated, but the actual care provided is in line with current best practice. In order to find out why these discrepancies exist, it can be useful to conduct interviews. Are the physicians simply not aware of the SOPs (because their existence is limited to the hospital’s intranet) or do they actively disagree with them or does the infrastructure make it impossible to provide the care as described? Another rationale for adding interviews is that some situations (or all of their possible variations for different patient groups or the day, night or weekend shift) cannot practically or ethically be observed. In this case, it is possible to ask those involved to report on their actions – being aware that this is not the same as the actual observation. A senior physician’s or hospital manager’s description of certain situations might differ from a nurse’s or junior physician’s one, maybe because they intentionally misrepresent facts or maybe because different aspects of the process are visible or important to them. In some cases, it can also be relevant to consider to whom the interviewee is disclosing this information – someone they trust, someone they are otherwise not connected to, or someone they suspect or are aware of being in a potentially “dangerous” power relationship to them. Lastly, a focus group could be conducted with representatives of the relevant professional groups to explore how and why exactly they provide care around EVT. The discussion might reveal discrepancies (between SOPs and actual care or between different physicians) and motivations to the researchers as well as to the focus group members that they might not have been aware of themselves. For the focus group to deliver relevant information, attention has to be paid to its composition and conduct, for example, to make sure that all participants feel safe to disclose sensitive or potentially problematic information or that the discussion is not dominated by (senior) physicians only. The resulting combination of data collection methods is shown in Fig.  2 .

[Figure 2: Possible combination of data collection methods]

The combination of multiple data source as described for this example can be referred to as “triangulation”, in which multiple measurements are carried out from different angles to achieve a more comprehensive understanding of the phenomenon under study [ 22 , 23 ].

Data analysis

To analyse the data collected through observations, interviews and focus groups, these need to be transcribed into protocols and transcripts (see Fig.  3 ). Interviews and focus groups can be transcribed verbatim, with or without annotations for behaviour (e.g. laughing, crying, pausing) and with or without phonetic transcription of dialects and filler words, depending on what is expected or known to be relevant for the analysis. In the next step, the protocols and transcripts are coded, that is, marked (or tagged, labelled) with one or more short descriptors of the content of a sentence or paragraph [ 2 , 15 , 23 ]. Jansen describes coding as “connecting the raw data with “theoretical” terms” [ 20 ]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation from multiple data sources (e.g. SOPs, emergency room observations, staff and patient interviews). In a process of synthesis and abstraction, the codes are then grouped, summarised and/or categorised [ 15 , 20 ]. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [ 20 ]. The coding process is performed using qualitative data management software, the most common ones being NVivo, MaxQDA and Atlas.ti. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [ 14 ].
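To make the idea that coding makes raw data sortable more concrete, the short Python sketch below is a purely illustrative, hypothetical example (the segments and codes are invented, and this is not how MaxQDA, Atlas.ti or NVivo work internally): it tags text segments from different data sources with codes and then retrieves every segment carrying a given code, which is the operation that later supports grouping and categorising.

```python
# Purely illustrative sketch: invented segments and codes from the EVT example.
# Dedicated software (e.g. MaxQDA, Atlas.ti, NVivo) is used in practice; the point
# here is only that coded data becomes sortable and retrievable by code.
from collections import defaultdict

segments = [
    ("SOP_EVT",              "Tele-neurology consult should start within 10 minutes.", ["tele-neurology", "time targets"]),
    ("ER_observation_03",    "Consult delayed because the video cart was in use.",      ["tele-neurology", "delay", "equipment"]),
    ("staff_interview_07",   "We often skip the consult during night shifts.",          ["tele-neurology", "staffing"]),
    ("patient_interview_02", "Nobody explained why we waited so long.",                 ["delay", "patient experience"]),
]

# Index: code -> all (source, segment) pairs tagged with that code
index = defaultdict(list)
for source, text, codes in segments:
    for code in codes:
        index[code].append((source, text))

# Extract every segment describing the tele-neurology consultation, across sources
for source, text in index["tele-neurology"]:
    print(f"[{source}] {text}")
```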

[Figure 3: From data collection to data analysis]

How to report qualitative research?

Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as in RCT protocols, i.e. to pre-define and set in stone the research questions and primary or secondary endpoints. Rather, it is a way to describe the research methods in detail, which might not be possible in the results paper given journals’ word limits. Qualitative research papers are usually longer than their quantitative counterparts to allow for deep understanding and so-called “thick description”. In the methods section, the focus is on transparency of the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation. The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [ 20 ]. Here it is important to support main findings by relevant quotations, which may add information, context, emphasis or real-life examples [ 20 , 23 ]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement (e.g. “Five interviewees expressed negative feelings towards XYZ”) [ 21 ].

How to combine qualitative with quantitative research?

Qualitative methods can be combined with other methods in multi- or mixed methods designs, which “[employ] two or more different methods [ …] within the same study or research program rather than confining the research to one single method” [ 24 ]. Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of (unexpected) results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [ 1 , 17 , 24 – 26 ]. The resulting designs can be classified according to when, why and how the different quantitative and/or qualitative data strands are combined. The three most common types of mixed method designs are the convergent parallel design , the explanatory sequential design and the exploratory sequential design. The designs with examples are shown in Fig.  4 .

[Figure 4: Three common mixed methods designs]

In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation of results. Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing results. Amongst other things, this would make it possible to assess whether interview respondents’ subjective impressions of patients receiving good care match modified Rankin Scores at follow-up, or whether observed delays in care provision are exceptions or the rule when compared to door-to-needle times as documented in the registry. In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the results from the quantitative study. This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times and the qualitative study would be used to understand where and why these occurred, and how they could be improved. In the exploratory sequential design, the qualitative study is carried out first and its results help to inform and build the quantitative study in the next step [ 26 ]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative study as to the topics on which dissatisfaction had been expressed. Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different (types of) hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.

How to assess qualitative research?

A variety of assessment criteria and lists have been developed for qualitative research, ranging in their focus and comprehensiveness [ 14 , 17 , 27 ]. However, none of these has been elevated to the “gold standard” in the field. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.

Assessors should check the authors’ use of and adherence to the relevant reporting checklists (e.g. Standards for Reporting Qualitative Research (SRQR)) to make sure all items that are relevant for this type of research are addressed [ 23 , 28 ]. Discussions of quantitative measures in addition to or instead of these qualitative measures can be a sign of lower quality of the research (paper). Providing and adhering to a checklist for qualitative research contributes to an important quality criterion for qualitative research, namely transparency [ 15 , 17 , 23 ].

Reflexivity

While methodological transparency and complete reporting are relevant for all types of research, some additional criteria must be taken into account for qualitative research. These include what is called reflexivity, i.e. sensitivity to the relationship between the researcher and the researched, including how contact was established and maintained, or the background and experience of the researcher(s) involved in data collection and analysis. Depending on the research question and population to be researched, this can be limited to professional experience, but it may also include gender, age or ethnicity [ 17 , 27 ]. These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [ 23 ]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and therefore the reader must be made aware of these details [ 19 ].

Sampling and saturation

The aim of qualitative sampling is for all variants of the objects of observation that are deemed relevant for the study to be present in the sample, “to see the issue and its meanings from as many angles as possible” [ 1 , 16 , 19 , 20 , 27 ], and to ensure “information-richness” [ 15 ]. An iterative sampling approach is advised, in which data collection (e.g. five interviews) is followed by data analysis, followed by more data collection to find variants that are lacking in the current sample. This process continues until no new (relevant) information can be found and further sampling becomes redundant – which is called saturation [ 1 , 15 ]. In other words: qualitative data collection finds its end point not a priori, but when the research team determines that saturation has been reached [ 29 , 30 ].
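Saturation is a judgment made by the research team rather than a computed quantity, but that judgment is often documented. The hypothetical Python sketch below (our simplified illustration, not a formal stopping rule from the cited literature; the codes and batch sizes are invented) shows one way such documentation could look: counting how many previously unseen codes appear in each successive batch of interviews.

```python
# Hypothetical illustration: codes identified in successive interview batches.
# A dwindling number of new codes can document (not replace) the research
# team's judgment that saturation has been reached.
batches = [
    {"transport", "bus service", "cost", "fear of hospital"},  # interviews 1-5
    {"transport", "cost", "family obligations", "language"},   # interviews 6-10
    {"transport", "language", "fear of hospital"},             # interviews 11-15
]

seen = set()
for i, codes in enumerate(batches, start=1):
    new_codes = codes - seen
    seen |= codes
    print(f"Batch {i}: {len(new_codes)} new code(s): {sorted(new_codes)}")
```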

This is also the reason why most qualitative studies use deliberate instead of random sampling strategies. This is generally referred to as “ purposive sampling” , in which researchers pre-define which types of participants or cases they need to include so as to cover all variations that are expected to be of relevance, based on the literature, previous experience or theory (i.e. theoretical sampling) [ 14 , 20 ]. Other types of purposive sampling include (but are not limited to) maximum variation sampling, critical case sampling or extreme or deviant case sampling [ 2 ]. In the above EVT example, a purposive sample could include all relevant professional groups and/or all relevant stakeholders (patients, relatives) and/or all relevant times of observation (day, night and weekend shift).

Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [ 14 ].

Good qualitative research is iterative in nature, i.e. it goes back and forth between data collection and analysis, revising and improving the approach where necessary. One example of this are pilot interviews, where different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether the interview can be audio-recorded) are tested with a small number of respondents, evaluated and revised [ 19 ]. In doing so, the interviewer learns which wording or types of questions work best, or which is the best length of an interview with patients who have trouble concentrating for an extended time. Of course, the same reasoning applies to observations or focus groups which can also be piloted.

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process when a common approach must be defined, including the establishment of a useful coding list (or tree), and when a common meaning of individual codes must be established [ 23 ]. An initial sub-set or all transcripts can be coded independently by the coders and then compared and consolidated after regular discussions in the research team. This is to make sure that codes are applied consistently to the research data.

Member checking

Member checking, also called respondent validation , refers to the practice of checking back with study respondents to see if the research is in line with their views [ 14 , 27 ]. This can happen after data collection or analysis or when first results are available [ 23 ]. For example, interviewees can be provided with (summaries of) their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [ 17 ]. Respondents’ feedback on these issues then becomes part of the data collection and analysis [ 27 ].

Stakeholder involvement

In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants (i.e. “members”, see above) but as consultants to and active participants in the broader research process [ 31 – 33 ]. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and (future) patients alike [ 34 , 35 ]. Using the example of patients on or nearing dialysis, a recent scoping review found that 80% of clinical research did not address the top 10 research priorities identified by patients and caregivers [ 32 , 36 ]. In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

How not to assess qualitative research

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of the limited usefulness of these endeavours.

Protocol adherence

Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies – in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.

Sample size

For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [ 1 , 14 , 27 , 37 – 39 ]. Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why.

Randomisation

While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [ 13 , 27 ]. Relevant disadvantages include the negative impact of an overly large sample size as well as the possibility (or probability) of selecting “quiet, uncooperative or inarticulate individuals” [ 17 ]. Qualitative studies do not use control groups, either.

Interrater reliability, variability and other “objectivity checks”

The concept of “interrater reliability” is sometimes used in qualitative research to assess the extent to which the coding of two co-coders overlaps. However, it is not clear what this measure tells us about the quality of the analysis [ 23 ]. Such scores can therefore be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but they are not a requirement. Relatedly, it is not relevant for the quality or “objectivity” of qualitative research to have different people recruit the study participants, collect the data and analyse them. Experience even shows that it might be better to have the same person or team perform all of these tasks [ 20 ]. First, when researchers introduce themselves during recruitment, this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio-recording is transcribed for analysis, the researcher conducting the interviews will usually remember the interviewee and the specific interview situation during data analysis. This might be helpful in providing additional context information for the interpretation of the data, e.g. on whether something might have been meant as a joke [ 18 ].
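To make the point concrete, the worked example below computes Cohen's kappa, one commonly reported interrater statistic, for two coders applying a single binary code to the same ten transcript segments; the data are invented, and the argument above is precisely that such a score is optional rather than required.

# Worked example (invented data): Cohen's kappa for two coders applying one
# binary code to ten segments; kappa expresses agreement beyond chance.
def cohens_kappa(labels_a, labels_b):
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    expected = sum((labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

coder_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
coder_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(f"kappa = {cohens_kappa(coder_a, coder_b):.2f}")  # 0.58 for this invented example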

Not being quantitative research

Being qualitative rather than quantitative research should not in itself be used as an assessment criterion if it is applied irrespective of the research problem at hand. Similarly, qualitative research should not be required to be combined with quantitative research per se – unless mixed methods research is judged as inherently better than single-method research. In that case, the same criterion should be applied to quantitative studies without a qualitative component.

The main take-away points of this paper are summarised in Table 1. We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using (only) quantitative designs. Seeing qualitative and quantitative methods as equal will help us become more aware and critical of the “fit” between the research problem and our chosen methods: I can conduct an RCT to determine the reasons for transportation delays of acute stroke patients – but should I? It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.

Take-away points

Acknowledgements

Abbreviations

Authors’ contributions

LB drafted the manuscript; WW and CG revised the manuscript; all authors approved the final versions.

Funding

No external funding.

Availability of data and materials

Ethics approval and consent to participate

Consent for publication

Competing interests

The authors declare no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rapid reviews methods series: guidance on rapid qualitative evidence synthesis

  • http://orcid.org/0000-0003-4808-3880 Andrew Booth 1 , 2 ,
  • Isolde Sommer 3 , 4 ,
  • http://orcid.org/0000-0003-4238-5984 Jane Noyes 2 , 5 ,
  • Catherine Houghton 2 , 6 ,
  • Fiona Campbell 1 , 7
  • The Cochrane Rapid Reviews Methods Group and Cochrane Qualitative and Implementation Methods Group (CQIMG)
  • 1 EnSyGN Sheffield Evidence Synthesis Group , University of Sheffield , Sheffield , UK
  • 2 Cochrane Qualitative and Implementation Methods Group (CQIMG) , London , UK
  • 3 Department for Evidence-based Medicine and Evaluation , University for Continuing Education Krems , Krems , Austria
  • 4 Cochrane Rapid Reviews Group & Cochrane Austria , Krems , Austria
  • 5 Bangor University , Bangor , UK
  • 6 University of Galway , Galway , Ireland
  • 7 University of Newcastle upon Tyne , Newcastle upon Tyne , UK
  • Correspondence to Professor Andrew Booth, University of Sheffield, Sheffield, UK; a.booth@sheffield.ac.uk

This paper forms part of a series of methodological guidance from the Cochrane Rapid Reviews Methods Group and addresses rapid qualitative evidence syntheses (QESs), which use modified systematic, transparent and reproducible methods to accelerate the synthesis of qualitative evidence when faced with resource constraints. This guidance covers the review process as it relates to synthesis of qualitative research. ‘Rapid’ or ‘resource-constrained’ QES require use of templates and targeted knowledge user involvement. Clear definition of perspectives and decisions on indirect evidence, sampling and use of existing QES help in targeting eligibility criteria. Involvement of an information specialist, especially in prioritising databases, targeting grey literature and planning supplemental searches, can prove invaluable. Use of templates and frameworks in study selection and data extraction can be accompanied by quality assurance procedures targeting areas of likely weakness. Current Cochrane guidance informs selection of tools for quality assessment and of synthesis method. Thematic and framework synthesis facilitate efficient synthesis of large numbers of studies or plentiful data. Finally, judicious use of the Grading of Recommendations Assessment, Development and Evaluation approach for assessing the Confidence of Evidence from Reviews of Qualitative research (GRADE-CERQual) and of software as appropriate helps to achieve a timely and useful review product.

  • Systematic Reviews as Topic
  • Patient Care

Data availability statement

No data are available. Not applicable. All data are from published articles.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See:  http://creativecommons.org/licenses/by-nc/4.0/ .

https://doi.org/10.1136/bmjebm-2023-112620


WHAT IS ALREADY KNOWN ON THIS TOPIC

Rapid Qualitative Evidence Synthesis (QES) is a relatively recent innovation in evidence synthesis and few published examples currently exist.

Guidance for authoring a rapid QES is scattered and requires compilation and summary.

WHAT THIS STUDY ADDS

This paper represents the first attempt to compile current guidance, illustrated by the experience of several international review teams.

We identify features of rapid QES methods that could be accelerated or abbreviated and where methods resemble those for conventional QESs.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

This paper offers guidance for researchers when conducting a rapid QES and informs commissioners of research and policy-makers what to expect when commissioning such a review.

Introduction

This paper forms part of a series from the Cochrane Rapid Reviews Methods Group providing methodological guidance for rapid reviews. While other papers in the series 1–4 focus on generic considerations, we aim to provide in-depth recommendations specific to a resource-constrained (or rapid) qualitative evidence synthesis (rQES). 5 This paper is accompanied by recommended resources ( online supplemental appendix A ) and an elaboration with practical considerations ( online supplemental appendix B ).


The role of qualitative evidence in decision-making is increasingly recognised. 6 This, in turn, has led to appreciation of the value of qualitative evidence syntheses (QESs) that summarise findings across multiple contexts. 7 Recognition of the need for such syntheses to be available at the time most useful to decision-making has, in turn, driven demand for rapid qualitative evidence syntheses. 8 The breadth of potential rQES mirrors the versatility of QES in general (from focused questions to broad overviews) and outputs range from descriptive thematic maps through to theory-informed syntheses (see table 1 ).


Table 1 Glossary of important terms (alphabetically)

As with other resource-constrained reviews, no one size fits all. A team should start by specifying the phenomenon of interest, the review question, 9 the perspectives to be included 9 and the sample to be determined and selected. 10 Subsequently, the team must finalise the appropriate choice of synthesis. 11 Above all, the review team should consider the intended knowledge users, 3 including requirements of the funder.

An rQES team, in particular, cannot afford any extra time or resource requirements that might arise from either a misunderstanding of the review question, an unclear picture of user requirements or an inappropriate choice of methods. The team seeks to align the review question and the requirements of the knowledge user with available time and resources. They also need to ensure that the choice of data and choice of synthesis are appropriate to the intended ‘knowledge claims’ (epistemology) made by the rQES. 11 This involves the team asking ‘what types of data are meaningful for this review question?’, ‘what types of data are trustworthy?’ and ‘is the favoured synthesis method appropriate for this type of data?’. 12 This paper aims to help rQES teams to choose methods that best fit their project while understanding the limitations of those choices. Our recommendations derive from current QES guidance, 5 evidence on modified QES methods, 8 13 and practical experience. 14 15

This paper presents an overview of considerations and recommendations as described in table 2 . Supplemental materials, including additional resources, details of our recommendations and practical examples, are provided in online supplemental appendices A and B .

Table 2 Recommendations for resource-constrained qualitative evidence synthesis (rQES)

Setting the review question and topic refinement

Rapid reviews summarise information from multiple research studies to produce evidence for ‘the public, researchers, policymakers and funders in a systematic, resource-efficient manner’. 16 Involvement of knowledge users is critical. 3 Given time constraints, individual knowledge users could be asked only to feedback on very specific decisions and tasks or on selective sections of the protocol. Specifically, whenever a QES is abbreviated or accelerated, a team should ensure that the review question is agreed by a minimum number of knowledge users with expertise or experience that reflects all the important review perspectives and with authority to approve the final version 2 5 11 ( table 2 , item R1).

Involvement of topic experts can ensure that the rQES is responsive to need. 14 17 One Cochrane rQES saved considerable time by agreeing the review topic within a single meeting and one-phase iteration. 9 Decisions on topics to be omitted are also informed by a knowledge of existing QESs. 17

An information specialist can help to manage the quantity and quality of available evidence by setting conceptual boundaries and logistic limits. A structured question format, such as Setting-Perspective-Interest (phenomenon of)-Comparison-Evaluation or Population-Interest (phenomenon of)-Context, helps in communicating the scope and, subsequently, in operationalising study selection. 9 18

Scoping (of review parameters) and mapping (of key types of evidence and likely richness of data) help when planning the review. 5 19 The option to choose purposive sampling over comprehensive sampling approaches, as offered by standard QES, may be particularly helpful in the context of a rapid QES. 8 Once a team knows the approximate number and distribution of studies (perhaps mapping them against country, age, ethnicity, etc), they can decide whether or not to use purposive sampling. 12 An rQES for the WHO combined purposive with variation sampling. Sampling took place in two stages: first, the initial number of studies was reduced to a more manageable sampling frame; then approximately a third of the remaining studies were sampled from within that frame. 20
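Purely to illustrate the two-stage logic sketched above (the WHO review's actual procedure was more elaborate and purposive, whereas this toy version samples the second stage at random), the following Python fragment first restricts an invented set of candidate studies to a manageable sampling frame and then draws roughly a third of that frame.

# Rough sketch (invented study records): stage 1 reduces candidates to a
# sampling frame (here, rich studies only); stage 2 draws about a third of the
# frame. A real rQES would sample purposively, e.g. for maximum variation.
import random

studies = [{"id": f"S{i:02d}", "rich": i % 2 == 0, "region": region}
           for i, region in enumerate(["Africa", "Asia", "Europe", "Americas"] * 10)]

frame = [s for s in studies if s["rich"]]                 # stage 1: sampling frame
random.seed(1)                                            # reproducible toy example
sample = random.sample(frame, k=max(1, len(frame) // 3))  # stage 2: ~ a third of the frame

print(f"{len(studies)} candidates -> frame of {len(frame)} -> sample of {len(sample)} studies")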

Sampling may target richer studies and/or privilege diversity. 8 21 A rich qualitative study typically illustrates findings with verbatim extracts from transcripts from interviews or textual responses from questionnaires. Rich studies are often found in specialist qualitative research or social science journals. In contrast, less rich studies may itemise themes with an occasional indicative text extract and tend to summarise findings. In clinical or biomedical journals less rich findings may be placed within a single table or box.

No rule exists on an optimal number of studies; too many studies make it challenging to ‘maintain insight’, 22 while too few do not sustain rigorous analysis. 23 Guidance on sampling is available from the forthcoming Cochrane-Campbell QES Handbook.

A review team can use templates to fast-track writing of a protocol. The protocol should always be publicly available ( table 2 , item R2). 24 25 Formal registration may require that the team has not commenced data extraction but should be considered if it does not compromise the rQES timeframe. Time pressures may require that methods are left suitably flexible to allow well-justified changes to be made as a detailed picture of the studies and data emerge. 26 The first Cochrane rQES drew heavily on text from a joint protocol/review template previously produced within Cochrane. 24

Setting eligibility criteria

An rQES team may need to limit the number of perspectives, focusing on those most important for decision-making 5 9 27 ( table 2 , item R3). Beyond the patients/clients, each additional perspective (eg, family members, health professionals, other professionals, etc) multiplies the effort involved.

A rapid QES may require strict date and setting restrictions 17 and language restrictions that accommodate the specific requirements of the review. Specifically, the team should consider whether changes in context over time or substantive differences between geographical regions could be used to justify a narrower date range or a limited coverage of countries and/or languages. The team should also decide if ‘indirect evidence’ is to substitute for the absence of direct evidence. An rQES typically focuses on direct evidence, except when only indirect evidence is available 28 ( table 2 , item R4). Decisions on relevance are challenging: precautions for swine influenza may inform precautions for bird influenza. 28 A smoking ban may operate similarly to seat belt legislation, etc. A review team should identify where such shared mechanisms might operate. 28 An rQES team must also decide whether to use frameworks or models to focus the review. Theories may be unearthed within the topic search or be already known to team members, for example, the Theory of Planned Behaviour. 29

Options for managing the quantity and quality of studies and data emerge during the scoping (see above). In summary, the review team should consider privileging rich qualitative studies 2 ; consider a stepwise approach to inclusion of qualitative data and explore the possibility of sampling ( table 2 , item R5). For example, where data are plentiful an rQES may be limited to qualitative research and/or to mixed methods studies. Where data are less plentiful, surveys or other qualitative data sources may need to be included. Where plentiful reviews already exist, a team may decide to conduct a review of reviews 5 by including multiple QES within a mega-synthesis 28 29 ( table 2 , item R6).

Searching for QES merits its own guidance; 21–23 30 this section reinforces important considerations from guidance specific to qualitative research. Generic guidance for rapid reviews in this series broadly applies to rapid QESs. 1

In addition to journal articles, by far the most plentiful source, qualitative research is found in book chapters, theses and in published and unpublished reports. 21 Searches to support an rQES can (a) limit the number of databases searched, deliberately selecting databases from diverse disciplines, (b) use abbreviated study filters to retrieve qualitative designs and (c) employ high yield complementary methods (eg, reference checking, citation searching and Related Articles features). An information specialist (eg, librarian) should be involved in prioritising sources and search methods ( table 2 , item R7). 11 14

According to empirical evidence, optimal database combinations include Scopus plus CINAHL or Scopus plus ProQuest Dissertations and Theses Global (two-database combinations) and Scopus plus CINAHL plus ProQuest Dissertations and Theses Global (three-database combination), with these choices retrieving between 89% and 92% of relevant studies. 30
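Recall figures of this kind are computed against a gold-standard set of studies already known to be relevant; the toy calculation below (with made-up study identifiers, not the cited evaluation's data) shows the arithmetic for a two-database combination.

# Illustrative arithmetic (made-up identifiers): recall of a database
# combination = relevant studies retrieved by the union of the databases,
# divided by all known relevant studies.
relevant = {"s1", "s2", "s3", "s4", "s5", "s6", "s7", "s8", "s9", "s10"}
retrieved_by = {
    "Scopus": {"s1", "s2", "s3", "s4", "s5", "s6", "s7"},
    "CINAHL": {"s3", "s4", "s8", "s9"},
}

combined = set().union(*retrieved_by.values())
recall = len(combined & relevant) / len(relevant)
print(f"combined recall = {recall:.0%}")  # 9 of 10 relevant studies -> 90%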

If resources allow, searches should include one or two specialised databases ( table 2 , item R8) from different disciplines or contexts 21 (eg, social science databases, specialist discipline databases or regional or institutional repositories). Even when resources are limited, the information specialist should factor in time for peer review of at least one search strategy ( table 2 , item R9). 31 Searches for ‘grey literature’ should selectively target appropriate types of grey literature (such as theses or process evaluations) and supplemental searches, including citation chaining or Related Articles features ( table 2 , item R10). 32 The first Cochrane rQES reported that searching reference lists of key papers yielded an extra 30 candidate papers for review. However, the team documented exclusion of grey literature as a limitation of their review. 15

Study selection

Consistency in study selection is achieved by using templates, by gaining a shared team understanding of the audience and purpose, and by ongoing communication within, and beyond, the team. 2 33 Individuals may work in parallel on the same task, as in the first Cochrane rQES, or follow a ‘segmented’ approach where each reviewer is allocated a different task. 14 The use of machine learning in the specific context of rQES remains experimental. However, the possibility of developing qualitative study classifiers comparable to those for randomised controlled trials offers an achievable aspiration. 34

Title and abstract screening

The entire screening team should use pre-prepared, pretested title and abstract templates to limit the scale of piloting, calibration and testing ( table 2 , item R11). 1 14 The first Cochrane rQES team double-screened titles and abstracts within Covidence review software. 14 Disagreements were resolved with reference to a third reviewer, achieving a shared understanding of the eligibility criteria and enhancing familiarity with target studies and insight from the data. 14 The team should target and prioritise identified risks of either over-zealous inclusion or over-exclusion specific to each rQES ( table 2 , item R12). 14 The team should maximise opportunities to capture divergent views and perspectives within study findings. 35

Full-text screening

Full-text screening similarly benefits from using a pre-prepared pretested standardised template where possible 1 14 ( table 2 , item R11). If a single reviewer undertakes full-text screening, 8 the team should identify likely risks to trustworthiness of findings and focus quality control procedures (eg, use of additional reviewers and percentages for double screening) on specific threats 14 ( table 2 , item R13). The Cochrane rQES team opted for double screening to assist their immersion within the topic. 14

Data extraction

Data extraction of descriptive/contextual data may be facilitated by review management software (eg, EPPI-Reviewer) or home-made approaches using Google Forms or other survey software. 36 Where extraction of qualitative findings requires line-by-line coding with multiple iterations of the data, a qualitative data management analysis package, such as QSR NVivo, reaps dividends. 36 The team must decide if, collectively, they favour extracting data to a template or coding directly within an electronic version of an article.

Quality control must be fit for purpose but not excessive. Published examples typically use a single reviewer for data extraction, 8 with use of two independent reviewers being the exception. The team could limit data extraction to minimal essential items. They may also consider re-using descriptive details and findings already extracted within previous well-conducted QES ( table 2 , item R14). A pre-existing framework, where readily identified, may help to structure the data extraction template. 15 37 The same framework may be used to present the findings. Some organisations may specify a preferred framework, such as an evidence-to-decision-making framework. 38
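A data extraction template restricted to minimal essential items could be as simple as the sketch below; the field names are illustrative assumptions, not the template used in any of the reviews cited here.

# Minimal sketch: writing a one-row data extraction template to CSV.
# Field names are hypothetical; adapt them to the review's framework.
import csv

FIELDS = ["study_id", "country", "setting", "participants", "data_collection",
          "analysis_method", "key_findings", "extractor", "checked_by"]

record = {"study_id": "S01", "country": "UK", "setting": "stroke unit",
          "participants": "12 nurses", "data_collection": "semi-structured interviews",
          "analysis_method": "thematic analysis",
          "key_findings": "handover delays; unclear roles",
          "extractor": "reviewer 1", "checked_by": "reviewer 2"}

with open("extraction.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(record)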

Assessment of methodological limitations

The QES community assesses ‘methodological limitations’ rather than using ‘risk of bias’ terminology. An rQES team should pick an approach appropriate to their specific review. For example, a thematic map may not require assessment of individual studies; a brief statement of the generic limitations of the set of studies may be sufficient. However, for any synthesis that underpins practice recommendations, 39 assessment of included studies is integral to the credibility of findings. In any decision-making context that involves recommendations or guidelines, an assessment of methodological limitations is mandatory. 40 41

Each review team should work with knowledge users to determine a review-specific approach to quality assessment. 27 While ‘traffic lights’, similar to the outputs from the Cochrane Risk of Bias tool, may facilitate rapid interpretation, accompanying textual notes are invaluable in highlighting specific areas for concern. In particular, the rQES team should demonstrate that they are aware (a) that research designs for qualitative research seek to elicit divergent views, rather than control for variation; (b) that, for qualitative research, the selection of the sample is far more informative than the size of the sample; and (c) that researchers from primary research, and equally reviewers for the qualitative synthesis, need to be thoughtful and reflexive about their possible influences on interpretation of either the primary data or the synthesised findings.

Selection of checklist

Numerous scales and checklists exist for assessing the quality of qualitative studies. In the absence of validated risk of bias tools for qualitative studies, the team should choose a tool according to Cochrane Qualitative and Implementation Methods Group (CQIMG) guidance together with expediency (according to ease of use, prior familiarity, etc) ( table 2 , item R15). 41 In comparison to the Critical Appraisal Skills Programme checklist which was never designed for use in synthesis, 42 the Cochrane qualitative tool is similarly easy to use and was designed for QES use. Work is underway to identify an assessment process that is compatible with QESs that support decision-making. 41 For now the choice of a checklist remains determined by interim Cochrane guidance and, beyond this, by personal preference and experience. For an rQES a team could use a single reviewer to assess methodological limitations, with verification of judgements (and support statements) by a second reviewer ( table 2 , item R16).

The CQIMG endorses three types of synthesis: thematic synthesis, framework synthesis and meta-ethnography ( box 1 ). 43 44 Rapid QES favour descriptive thematic synthesis 45 or framework synthesis, 46 47 except when theory generation (meta-ethnography 48 49 or analytical thematic synthesis) is a priority ( table 2 , item R17).

Box 1 Choosing a method for rapid qualitative synthesis

Thematic synthesis: first choice method for rQES. 45 For example, in their rapid QES Crooks and colleagues 44 used a thematic synthesis to understand the experiences of both academic and lived experience coresearchers within palliative and end of life research. 45

Framework synthesis: alternative where a suitable framework can be speedily identified. 46 For example, Bright and colleagues 46 considered ‘best-fit framework synthesis’ as appropriate for mapping study findings to an ‘a priori framework of dimensions measured by prenatal maternal anxiety tools’ within their ‘streamlined and time-limited evidence review’. 47

Less commonly, an adapted meta-ethnographical approach was used for an implementation model of social distancing where supportive data (29 studies) was plentiful. 48 However, this QES demonstrates several features that subsequently challenge its original identification as ‘rapid’. 49

Abbreviations: QES, qualitative evidence synthesis; rQES, resource-constrained qualitative evidence synthesis.

The team should consider whether a conceptual model, theory or framework offers a rapid way of organising, coding, interpreting and presenting findings ( table 2 , item R18). If the extracted data appear rich enough to sustain further interpretation, data from a thematic or framework synthesis can subsequently be explored within a meta-ethnography. 43 However, this requires a team with substantial interpretative expertise. 11

Assessments of confidence in the evidence 4 are central to any rQES that seeks to support decision-making, and the QES-specific Grading of Recommendations Assessment, Development and Evaluation approach for assessing the Confidence of Evidence from Reviews of Qualitative research (GRADE-CERQual) is designed to assess confidence in qualitative evidence. 50 This can be performed by a single reviewer and confirmed by a second reviewer. 26 Additional reviewers could verify all, or a sample of, assessments. For a rapid assessment a team must prioritise findings using objective criteria; a WHO rQES focused only on the three ‘highly synthesised findings’. 20 The team could consider re-using GRADE-CERQual assessments from published QESs if findings are relevant and of demonstrably high quality ( table 2 , item R19). 50 No rapid approach to full application of GRADE-CERQual currently exists.

Reporting and record management

Little is written on optimal use of technology. 8 A rapid review is not a good time to learn review management software or qualitative analysis management software. Using such software for all general QES processes ( table 2 , item R20), and then harnessing these skills and tools when specifically under resource pressures, is a sounder strategy. Good file labelling and folder management and a ‘develop once, re-use multi-times’ approach facilitate resource savings.

Reporting requirements include the meta-ethnography reporting guidance (eMERGe) 51 and the Enhancing transparency in reporting the synthesis of qualitative research (ENTREQ) statement. 52 An rQES should describe limitations and their implications for confidence in the evidence even more thoroughly than a regular QES, detailing the consequences of fast-tracking, streamlining or omitting processes altogether. 8 Time spent documenting reflexivity is similarly important. 27 If QES methodology is to remain credible, rapid approaches must be applied with insight and documented with circumspection. 53 54

Ethics statements

Patient consent for publication.

Not applicable.

Ethics approval


Supplementary materials

Supplementary data.

This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.

  • Data supplement 1

Correction notice Since this paper was first published, updates have been made to the left hand column of table 2.

Contributors All authors (AB, IS, JN, CH, FC) have made substantial contributions to the conception and design of the guidance document. AB led on drafting the work and revising it critically for important intellectual content. All other authors (IS, JN, CH, FC) contributed to revisions of the document. All authors (AB, IS, JN, CH, FC) have given final approval of the version to be published. As members of the Cochrane Qualitative and Implementation Methods Group and/or the Cochrane Rapid Reviews Methods Group all authors (AB, IS, JN, CH, FC) agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Competing interests AB is co-convenor of the Cochrane Qualitative and Implementation Methods Group. In the last 36 months, he received royalties from Systematic Approaches To a Successful Literature Review (Sage 3rd edition), honoraria from the Agency for Healthcare Research and Quality, and travel support from the WHO. JN is lead convenor of the Cochrane Qualitative and Implementation Methods Group. In the last 36 months, she has received honoraria from the Agency for Healthcare Research and Quality and travel support from the WHO. CH is co-convenor of the Cochrane Qualitative and Implementation Methods Group.

Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

Provenance and peer review Not commissioned; internally peer reviewed.

Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.



2024 Speakers


Chris Fields , Tufts University, USA Chris Fields, PhD is a Researcher at Tufts University and a private consultant. His current work focuses on quantum-information based models of multi-observer communication, shared system identification, and consistent embedding of identified systems in a spatial geometry.  These models have allowed reformulating the Free Energy Principle (FEP) of Karl Friston and colleagues in the language of quantum information theory and the systematic exploration of qualitative differences between classical and quantum formulations of the FEP.  They have also provided a first-principles understanding of compartmentalization, high fan-in/fan-out morphology, and hierarchical control in biological systems at multiple scales.  For recent publications, see https://chrisfieldsresearch.com/.


Jay L. Garfield , Smith College, USA Jay L. Garfield is Doris Silbert Professor in the Humanities and Professor of Philosophy and Buddhist Studies at Smith College, Visiting Professor of Buddhist philosophy at Harvard Divinity School, Professor of Philosophy at Melbourne University and Adjunct Professor of Philosophy at the Central Institute of Higher Tibetan Studies. Academicinfluence.com has identified him as one of the 50 most influential philosophers in the world over the past decade.

Garfield’s research addresses topics in the foundations of cognitive science and the philosophy of mind; metaphysics; the history of modern Indian philosophy; topics in ethics, epistemology and the philosophy of logic; the philosophy of the Scottish enlightenment; methodology in cross-cultural interpretation; and topics in Buddhist philosophy, particularly Indo-Tibetan Madhyamaka and Yogācāra. He is the author or editor of over 30 books and over 200 articles, chapters, and reviews.


COMMENTS

  1. MY421 Qualitative Research Methods

    It prepares students to design, carry out, report, read and evaluate qualitative research projects. First, students learn how to collect data using methods including interviews, focus groups, participant observation, and selecting documents and new media data. Second, we cover analysis, using thematic and discourse analysis.

  2. Short course: Qualitative Research Methods

    It covers the full cycle of a field-based qualitative research project: from design, to data collection, analysis, reporting and dissemination. The course has the dual aims of equipping students with conceptual understandings of current academic debates regarding methods, and the practical skills to put those methods into practice.

  3. MSc Social Research Methods

    You will acquire skills of 'practical scholarship' and the ability to design, conduct, analyse and report a social research project. The MSc Social Research Methods also offers more specialised streams in Population and Gender. The syllabus for the MSc goes some way beyond the ESRC's requirements for the first year of a 1+3 PhD programme ...

  4. MY521 Qualitative Research Methods

    It prepares students to design, carry out, report, read and evaluate qualitative research projects. First, students learn how to collect data using methods including interviews, focus groups, participant observation, and selecting documents and new media data. Second, we cover analysis, using thematic and discourse analysis.

  5. Department of Methodology

    Department of Methodology. The Department is an international centre of excellence in social science methodology. We offer postgraduate programmes in social research methods, applied social data science and demography. We also run courses for students across LSE covering research design, qualitative, quantitative and computational methods.

  6. SO492 Qualitative Social Research Methods

    SO492. Half Unit. Qualitative Social Research Methods. This information is for the 2021/22 session. Dr Carrie Friese STC S213. This course is compulsory on the MSc in Culture and Society. This course is available on the MPhil/PhD in Cities Programme, MPhil/PhD in Sociology, MSc in City Design and Social Science, MSc in Economy and Society, MSc ...

  7. Methods Short Courses

    Methodology courses for PhD students. We're offering workshops under MY530 and MY560 on a variety of specific and advanced topics in qualitative and quantitative research methods. These workshops are primarily aimed at PhD students but may be taken by Masters students. The workshops are non-examinable, self-contained and may be attended ...

  8. Book Review: Doing Qualitative Research: A Practical Handbook

    This book will be of great use to students studying research methods, and will give them a thorough and readable introduction to what can sometimes feel like a rather overwhelming subject, concludes Sally Brown. This originally appeared on LSE Review of Books. Doing Qualitative Research: A Practical Handbook. 4th Edition. David Silverman.

  9. Summary of ME305 Qualitative Research Methods

    It covers the full cycle of a field-based qualitative research project: from design, to data collection, analysis, reporting and dissemination. The course has the dual aims of equipping students with conceptual understandings of current academic debates regarding methods, and the practical skills to put those methods into practice.

  10. MSc Social Research Methods

    Information for prospective students MSc Social Research Methods is now open for 2023/24 applications. Visit the MSc in Social Research Methods online prospectus page. MSc Social Research Methods will provide you with the opportunity to develop sophistication in research design and quantitative and qualitative research, and to undertake courses in one or more social science disciplines.

  11. Book Review: Doing Qualitative Research: A Practical ...

    In the fourth edition of his best-selling textbook, David Silverman provides a step-by-step guide to planning and conducting qualitative research. Using real examples from real postgraduate students, the book aims to make it easy to link theory to methods and shows how to move from understanding the principles of qualitative research to doing it yourself.

  12. Collect and analyse data

    Data analysis. NVivo, SPSS, R, and Stata are all installed as standard on LSE PCs, and you can apply for a licence for your own computer via the DTS specialist software page . NVivo can be used in qualitative and mixed methods research to analyse and code text, as well as manage survey and interview data. Check out the Digital Skills Lab online ...

  13. Research Methods

    Research Methods, Data Science and Mathematics at LSE Summer School offer you a slew of state-of-the-art courses. From statistics and machine learning to real analysis to qualitative research, our courses will give you a rock-solid grounding in the techniques, the tools and the methods you will need to ensure success. ...

  14. Planning Qualitative Research: Design and Decision Making for New

    While many books and articles guide various qualitative research methods and analyses, there is currently no concise resource that explains and differentiates among the most common qualitative approaches. We believe novice qualitative researchers, students planning the design of a qualitative study or taking an introductory qualitative research course, and faculty teaching such courses can ...

  15. Department of Methodology LSE

    The Department of Methodology's key function is to provide training for PhD and MSc students and LSE staff in the design of social research and in qualitative and quantitative analysis. For more ...

  16. Qualitative and quantitative research are fundamentally ...

    Qualitative and quantitative are qualifiers that apply to data. To speak of qualitative and quantitative research, methods or a variety of other aspects of inquiry may make for interesting academic pursuit and worthy discussion; however, it all eventually comes down to data and interpreting them.

  17. What Is Qualitative Research?

    Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research. Qualitative research is the opposite of quantitative research, which involves collecting and ...

  18. Qualitative methods overview

    The social care evidence base reveals a distinct preference for qualitative methods covering a broad range of social care topics. This review provides an introduction to the different ways in which qualitative research has been used in social care and some of the reasons why it has been successful in identifying under-researched areas, in documenting the experiences of people using services ...

  19. Carrying out qualitative research under lockdown

    It may also affect the way we go about conducting research. Many researchers are having to suspend data collection or re-design their projects taking into account social-distancing measures. Much qualitative research typically relies on face-to-face interaction for data collection through interviews, focus groups and field work.

  20. How to use and assess qualitative research methods

    Abstract. This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions ...

  21. Qualitative research: experience in using semi ...

    Horton, Joanne, Macve, Richard and Struyven, Geert (2004) Qualitative research: experience in using semi-structured interviews. In: Humphrey, Christopher and Lee, Bill H. K., (eds.) The Real Life Guide to Accounting Research: a Behind-The-Scenes View of Using Qualitative Research Methods. Elsevier Science, Amsterdam, The Netherlands, pp. 339-358.

  22. Qualitative research methods in information ...

    Qualitative research methods in information systems: a call for phenomenon-focused problematization Monteiro, Eric, Constantinides, Panos, Scott, Susan V. ORCID: 0000-0002-8775-9364, Shaikh, Maha and Burton-Jones, Andrew (2022) Qualitative research methods in information systems: a call for phenomenon-focused problematization.

  23. PDF Research, Interrupted: COVID-19's impact on LSE PhD students employing

    funded by an LSE Studentship. Her research investigates how people speak about and ‘do’ race in a supposedly post-race society. To do this, she employs qualitative methods to study ‘narratives of whiteness’ in German primary schools and how these impact schooling. Sarah's research interests include critical race theory,

  24. Rapid reviews methods series: guidance on rapid qualitative evidence

    This paper forms part of a series of methodological guidance from the Cochrane Rapid Reviews Methods Group and addresses rapid qualitative evidence syntheses (QESs), which use modified systematic, transparent and reproducible methods to accelerate the synthesis of qualitative evidence when faced with resource constraints. This guidance covers the review process as it relates to synthesis of ...

  25. 2024 Speakers

    Jakub Tesař, Charles University, Prague. Jakub Tesař is an assistant professor at Charles University in Prague. His research is focused on possible applications of quantum theory in the social sciences, namely the decision-making in the political (survey effects) and strategic context (game theory). He is interested in relational (as opposed ...