
Data Analysis: Types, Methods & Techniques (a Complete List)

(Updated Version)

While the term sounds intimidating, “data analysis” is nothing more than making sense of information in a table. It consists of filtering, sorting, grouping, and manipulating data tables with basic algebra and statistics.

In fact, you don’t need experience to understand the basics. You have already worked with data extensively in your life, and “analysis” is nothing more than a fancy word for good sense and basic logic.

Over time, people have intuitively categorized the best logical practices for treating data. These categories are what we now call types, methods, and techniques.

This article provides a comprehensive list of types, methods, and techniques, and explains the difference between them.

For a practical intro to data analysis (including types, methods, & techniques), check out our Intro to Data Analysis eBook for free.

Descriptive, Diagnostic, Predictive, & Prescriptive Analysis

If you Google “types of data analysis,” the first few results will explore descriptive, diagnostic, predictive, and prescriptive analysis. Why? Because these names are easy to understand and are used a lot in “the real world.”

Descriptive analysis is an informational method, diagnostic analysis explains “why” a phenomenon occurs, predictive analysis seeks to forecast the result of an action, and prescriptive analysis identifies solutions to a specific problem.

That said, these are only four branches of a larger analytical tree.

Good data analysts know how to position these four types within other analytical methods and tactics, allowing them to leverage the strengths and weaknesses of each to unearth the most valuable insights.

Let’s explore the full analytical tree to understand how to appropriately assess and apply these four traditional types.

Tree diagram of Data Analysis Types, Methods, and Techniques

Here’s a picture to visualize the structure and hierarchy of data analysis types, methods, and techniques.

If it’s too small, you can view the picture in a new tab. Open it to follow along!

[Tree diagram: data analysis types, methods, and techniques]

Note: basic descriptive statistics such as mean , median , and mode , as well as standard deviation , are not shown because most people are already familiar with them. In the diagram, they would fall under the “descriptive” analysis type.

Tree Diagram Explained

The highest-level classification of data analysis is quantitative vs qualitative. Quantitative implies numbers while qualitative implies information other than numbers.

Quantitative data analysis then splits into mathematical analysis and artificial intelligence (AI) analysis. Mathematical types then branch into descriptive, diagnostic, predictive, and prescriptive.

Methods falling under mathematical analysis include clustering, classification, forecasting, and optimization. Qualitative data analysis methods include content analysis, narrative analysis, discourse analysis, framework analysis, and/or grounded theory.

Moreover, mathematical techniques include regression, Naïve Bayes, simple exponential smoothing, cohorts, factors, linear discriminants, and more, whereas techniques falling under the AI type include artificial neural networks, decision trees, evolutionary programming, and fuzzy logic. Techniques under qualitative analysis include text analysis, coding, idea pattern analysis, and word frequency.

It’s a lot to remember! Don’t worry, once you understand the relationship and motive behind all these terms, it’ll be like riding a bike.

We’ll move down the list from top to bottom and I encourage you to open the tree diagram above in a new tab so you can follow along .

But first, let’s just address the elephant in the room: what’s the difference between methods and techniques anyway?

Difference between methods and techniques

Though often used interchangeably, methods and techniques are not the same. By definition, methods are the processes by which techniques are applied, and techniques are the practical applications of those methods.

For example, consider driving. Methods include staying in your lane, stopping at a red light, and parking in a spot. Techniques include turning the steering wheel, braking, and pushing the gas pedal.

Data sets: observations and fields

It’s important to understand the basic structure of data tables to comprehend the rest of the article. A data set consists of one far-left column containing observations, then a series of columns containing the fields (aka “traits” or “characteristics”) that describe each observation. For example, imagine we want a data table for fruit. It might look like this:
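(The values below are invented purely for illustration.)

Fruit (observation) | Color  | Weight (g) | Price ($)
Apple               | Red    | 182        | 0.50
Banana              | Yellow | 118        | 0.25
Cherry              | Red    | 8          | 0.10

Each row is an observation (a fruit), and Color, Weight, and Price are the fields that describe it.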

Now let’s turn to types, methods, and techniques. Each heading below consists of a description, relative importance, the nature of data it explores, and the motivation for using it.

Quantitative Analysis

  • It accounts for more than 50% of all data analysis and is by far the most widespread and well-known type of data analysis.
  • As you have seen, it holds descriptive, diagnostic, predictive, and prescriptive methods, which in turn hold some of the most important techniques available today, such as clustering and forecasting.
  • It can be broken down into mathematical and AI analysis.
  • Importance: Very high. Quantitative analysis is a must for anyone interested in becoming or improving as a data analyst.
  • Nature of Data: data treated under quantitative analysis is, quite simply, quantitative. It encompasses all numeric data.
  • Motive: to extract insights. (Note: we’re at the top of the tree, so this motive is broad; it gets more specific as we move down.)

Qualitative Analysis

  • It accounts for less than 30% of all data analysis and is common in social sciences .
  • It can refer to the simple recognition of qualitative elements, which is not analytic in any way, but most often refers to methods that assign numeric values to non-numeric data for analysis.
  • Because of this, some argue that it’s ultimately a quantitative type.
  • Importance: Medium. In general, knowing qualitative data analysis is not common or even necessary for corporate roles. However, for researchers working in social sciences, its importance is very high .
  • Nature of Data: data treated under qualitative analysis is non-numeric. However, as part of the analysis, analysts turn non-numeric data into numbers, at which point many argue it is no longer qualitative analysis.
  • Motive: to extract insights. (This gets more specific as we move down the tree.)

Mathematical Analysis

  • Description: mathematical data analysis is a subtype of quantitative data analysis that designates methods and techniques based on statistics, algebra, and logical reasoning to extract insights. It stands in opposition to artificial intelligence analysis.
  • Importance: Very High. The most widespread methods and techniques fall under mathematical analysis. In fact, it’s so common that many people use “quantitative” and “mathematical” analysis interchangeably.
  • Nature of Data: numeric. By definition, all data under mathematical analysis are numbers.
  • Motive: to extract measurable insights that can be acted upon.

Artificial Intelligence & Machine Learning Analysis

  • Description: artificial intelligence and machine learning analyses designate techniques based on the titular skills. They are not traditionally mathematical, but they are quantitative since they use numbers. Applications of AI & ML analysis techniques are developing quickly, but they are not yet mainstream across the field.
  • Importance: Medium. As of today (September 2020), you don’t need to be fluent in AI & ML data analysis to be a great analyst. BUT, if it’s a field that interests you, learn it. Many believe that in 10 years’ time its importance will be very high.
  • Nature of Data: numeric.
  • Motive: to create calculations that build on themselves in order to extract insights without direct input from a human.

Descriptive Analysis

  • Description: descriptive analysis is a subtype of mathematical data analysis that uses methods and techniques to provide information about the size, dispersion, groupings, and behavior of data sets. This may sound complicated, but just think about mean, median, and mode: all three are types of descriptive analysis. They provide information about the data set. We’ll look at specific techniques below.
  • Importance: Very high. Descriptive analysis is among the most commonly used data analyses in both corporations and research today.
  • Nature of Data: the nature of data under descriptive statistics is sets. A set is simply a collection of numbers that behaves in predictable ways. Data reflects real life, and there are patterns everywhere to be found. Descriptive analysis describes those patterns.
  • Motive: the motive behind descriptive analysis is to understand how numbers in a set group together, how far apart they are from each other, and how often they occur. As with most statistical analysis, the more data points there are, the easier it is to describe the set.

Diagnostic Analysis

  • Description: diagnostic analysis answers the question “why did it happen?” It is an advanced type of mathematical data analysis that manipulates multiple techniques, but does not own any single one. Analysts engage in diagnostic analysis when they try to explain why.
  • Importance: Very high. Diagnostics are probably the most important type of data analysis for people who don’t do analysis themselves, because they’re valuable to anyone who’s curious. They’re most common in corporations, as managers often only want to know the “why.”
  • Nature of Data : data under diagnostic analysis are data sets. These sets in themselves are not enough under diagnostic analysis. Instead, the analyst must know what’s behind the numbers in order to explain “why.” That’s what makes diagnostics so challenging yet so valuable.
  • Motive: the motive behind diagnostics is to diagnose — to understand why.

Predictive Analysis

  • Description: predictive analysis uses past data to project future data. It’s very often one of the first kinds of analysis new researchers and corporate analysts use because it is intuitive. It is a subtype of the mathematical type of data analysis, and its three notable techniques are regression, moving average, and exponential smoothing.
  • Importance: Very high. Predictive analysis is critical for any data analyst working in a corporate environment. Companies always want to know what the future will hold — especially for their revenue.
  • Nature of Data: Because past and future imply time, predictive data always includes an element of time. Whether it’s minutes, hours, days, months, or years, we call this time series data . In fact, this data is so important that I’ll mention it twice so you don’t forget: predictive analysis uses time series data .
  • Motive: the motive for investigating time series data with predictive analysis is to predict the future in the most analytical way possible.

Prescriptive Analysis

  • Description: prescriptive analysis is a subtype of mathematical analysis that answers the question “what will happen if we do X?” It’s largely underestimated in the data analysis world because it requires diagnostic and descriptive analyses to be done before it even starts. More than simple predictive analysis, prescriptive analysis builds entire data models to show how a simple change could impact the ensemble.
  • Importance: High. Prescriptive analysis is most common under the finance function in many companies. Financial analysts use it to build models of the financial statements that show how the figures would change given alternative inputs.
  • Nature of Data: the nature of data in prescriptive analysis is data sets. These data sets contain patterns that respond differently to various inputs. Data that is useful for prescriptive analysis contains correlations between different variables. It’s through these correlations that we establish patterns and prescribe action on this basis. This analysis cannot be performed on data that exists in a vacuum — it must be viewed on the backdrop of the tangibles behind it.
  • Motive: the motive for prescriptive analysis is to establish, with an acceptable degree of certainty, what results we can expect given a certain action. As you might expect, this necessitates that the analyst or researcher be aware of the world behind the data, not just the data itself.

Clustering Method

  • Description: the clustering method groups data points together based on their relative closeness to further explore and treat them based on these groupings. There are two ways to group clusters: intuitively and statistically (for example, with k-means).
  • Importance: Very high. Though most corporate roles group clusters intuitively based on management criteria, a solid understanding of how to group them mathematically is an excellent descriptive and diagnostic approach to allow for prescriptive analysis thereafter.
  • Nature of Data : the nature of data useful for clustering is sets with 1 or more data fields. While most people are used to looking at only two dimensions (x and y), clustering becomes more accurate the more fields there are.
  • Motive: the motive for clustering is to understand how data sets group and to explore them further based on those groups.
  • Here’s an example set:

[Example data set for clustering]

Classification Method

  • Description: the classification method aims to separate and group data points based on common characteristics . This can be done intuitively or statistically.
  • Importance: High. While simple on the surface, classification can become quite complex. It’s very valuable in corporate and research environments, but can feel like it’s not worth the work. A good analyst can execute it quickly to deliver results.
  • Nature of Data: the nature of data useful for classification is data sets. As we will see, it can be used on qualitative data as well as quantitative. This method requires knowledge of the substance behind the data, not just the numbers themselves.
  • Motive: the motive for classification is to group data not based on mathematical relationships (which would be clustering), but by predetermined outputs. This is why it’s less useful for diagnostic analysis, and more useful for prescriptive analysis.

Forecasting Method

  • Description: the forecasting method uses past time series data to forecast the future.
  • Importance: Very high. Forecasting falls under predictive analysis and is arguably the most common and most important method in the corporate world. It is less useful in research, which prefers to understand the known rather than speculate about the future.
  • Nature of Data: data useful for forecasting is time series data, which, as we’ve noted, always includes a variable of time.
  • Motive: the motive for the forecasting method is the same as that of predictive analysis: to confidently estimate future values.

Optimization Method

  • Description: the optimization method maximizes or minimizes values in a set given a set of criteria. It is arguably most common in prescriptive analysis. In mathematical terms, it is maximizing or minimizing a function given certain constraints (a minimal sketch follows this list).
  • Importance: Very high. The idea of optimization applies to more analysis types than any other method. In fact, some argue that it is the fundamental driver behind data analysis. You would use it everywhere in research and in a corporation.
  • Nature of Data: the nature of optimizable data is a data set of at least two points.
  • Motive: the motive behind optimization is to achieve the best result possible given certain conditions.
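To make the idea concrete, here is a minimal optimization sketch in Python using SciPy’s linear programming solver. SciPy is assumed to be installed, and the products, profits, and resource limits are invented for illustration.

```python
# Maximize profit from two products subject to resource constraints.
from scipy.optimize import linprog

# Objective: maximize 20*x1 + 30*x2 (profit per unit).
# linprog minimizes, so we negate the coefficients.
c = [-20, -30]

# Constraints: x1 + 2*x2 <= 100 (labor hours), 3*x1 + x2 <= 90 (material units).
A_ub = [[1, 2], [3, 1]]
b_ub = [100, 90]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x)      # optimal quantity of each product
print(-result.fun)   # maximum achievable profit
```

The solver returns the product mix that achieves the best possible result without breaking either constraint, which is exactly the motive described above.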

Content Analysis Method

  • Description: content analysis is a method of qualitative analysis that quantifies textual data to track themes across a document. It’s most common in academic fields and in social sciences, where written content is the subject of inquiry.
  • Importance: High. In a corporate setting, content analysis as such is less common. If anything, Naïve Bayes (a technique we’ll look at below) is the closest corporations come to analyzing text. However, it is of the utmost importance for researchers. If you’re a researcher, check out this article on content analysis .
  • Nature of Data: data useful for content analysis is textual data.
  • Motive: the motive behind content analysis is to understand themes expressed in a large text

Narrative Analysis Method

  • Description: narrative analysis is a method of qualitative analysis that quantifies stories to trace themes in them. It differs from content analysis because it focuses on stories rather than research documents, and the techniques used are slightly different from those in content analysis (the nuances are outside the scope of this article).
  • Importance: Low. Unless you are highly specialized in working with stories, narrative analysis is rare.
  • Nature of Data: the nature of the data useful for the narrative analysis method is narrative text.
  • Motive: the motive for narrative analysis is to uncover hidden patterns in narrative text.

Discourse Analysis Method

  • Description: the discourse analysis method falls under qualitative analysis and uses thematic coding to trace patterns in real-life discourse. That said, real-life discourse is oral, so it must first be transcribed into text.
  • Importance: Low. Unless you are focused on understanding real-world idea sharing in a research setting, this kind of analysis is less common than the others on this list.
  • Nature of Data: the nature of data useful in discourse analysis is first audio files, then transcriptions of those audio files.
  • Motive: the motive behind discourse analysis is to trace patterns of real-world discussions. (As a spooky sidenote, have you ever felt like your phone microphone was listening to you and making reading suggestions? If it was, the method was discourse analysis.)

Framework Analysis Method

  • Description: the framework analysis method falls under qualitative analysis and uses similar thematic coding techniques to content analysis. However, where content analysis aims to discover themes, framework analysis starts with a framework and only considers elements that fall in its purview.
  • Importance: Low. As with the other textual analysis methods, framework analysis is less common in corporate settings. Even in the world of research, only some use it. Strangely, it’s very common for legislative and political research.
  • Nature of Data: the nature of data useful for framework analysis is textual.
  • Motive: the motive behind framework analysis is to understand what themes and parts of a text match your search criteria.

Grounded Theory Method

  • Description: the grounded theory method falls under qualitative analysis and uses thematic coding to build theories around those themes.
  • Importance: Low. Like other qualitative analysis techniques, grounded theory is less common in the corporate world. Even among researchers, you would be hard pressed to find many using it. Though powerful, it’s simply too rare to spend time learning.
  • Nature of Data: the nature of data useful in the grounded theory method is textual.
  • Motive: the motive of grounded theory method is to establish a series of theories based on themes uncovered from a text.

Clustering Technique: K-Means

  • Description: k-means is a clustering technique in which data points are grouped into the cluster whose mean is closest to them. Though not considered AI or ML, it is an unsupervised approach: clusters are recomputed from the data themselves as points are added, rather than learned from labeled examples. Clustering techniques can be used in diagnostic, descriptive, & prescriptive data analyses (a minimal sketch follows this list).
  • Importance: Very important. If you only take 3 things from this article, k-means clustering should be part of it. It is useful in any situation where n observations have multiple characteristics and we want to put them in groups.
  • Nature of Data: the nature of data is at least one characteristic per observation, but the more the merrier.
  • Motive: the motive for clustering techniques such as k-means is to group observations together and either understand or react to them.
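Here is a minimal k-means sketch in Python. It assumes scikit-learn and NumPy are installed, and the customer values are invented for illustration.

```python
# Group six customers into two clusters by age and annual spend.
import numpy as np
from sklearn.cluster import KMeans

# Each row is an observation; each column is a field (age, annual spend).
X = np.array([
    [23, 1200], [25, 1500], [31, 1100],   # younger, lower spend
    [45, 5200], [48, 6100], [52, 5800],   # older, higher spend
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)            # cluster assignment for each observation
print(kmeans.cluster_centers_)   # the mean (center) of each cluster
```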

Regression Technique

  • Description: simple and multivariable regressions use either one independent variable or a combination of multiple independent variables to estimate their relationship with a single dependent variable using fitted constants (coefficients). Regressions are almost synonymous with correlation today (a minimal sketch follows this list).
  • Importance: Very high. Along with clustering, if you only take 3 things from this article, regression techniques should be part of it. They’re everywhere in corporate and research fields alike.
  • Nature of Data: the nature of data used in regressions is data sets with “n” number of observations and as many variables as are reasonable. It’s important, however, to distinguish between time series data and regression data. You cannot use regressions on time series data without accounting for time. The easier way is to use techniques under the forecasting method.
  • Motive: The motive behind regression techniques is to understand correlations between independent variable(s) and a dependent one.
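Here is a minimal regression sketch in Python. It assumes scikit-learn and NumPy are installed, and the ad-spend and revenue figures are invented for illustration.

```python
# Fit a line relating ad spend (independent variable) to revenue (dependent variable).
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[10], [20], [30], [40], [50]])   # independent variable: ad spend
y = np.array([120, 190, 310, 380, 510])        # dependent variable: revenue

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)   # the fitted constants (slope and intercept)
print(model.predict([[60]]))           # estimated revenue at a new spend level
```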

Naïve Bayes Technique

  • Description: Naïve Bayes is a classification technique that uses simple probability to classify items based on previous classifications. In plain English, the formula would be “the chance that a thing with trait x belongs to class c depends on (=) the overall chance of trait x appearing in class c, multiplied by the overall chance of class c, divided by the overall chance of getting trait x.” As a formula, it’s P(c|x) = P(x|c) * P(c) / P(x). (A minimal sketch follows this list.)
  • Importance: High. Naïve Bayes is a very common, simple classification technique because it’s effective with large data sets and it can be applied to any instance in which there is a class. Google, for example, might use it to group webpages into groups for certain search engine queries.
  • Nature of Data: the nature of data for Naïve Bayes is at least one class and at least two traits in a data set.
  • Motive: the motive behind Naïve Bayes is to classify observations based on previous data. It’s thus considered part of predictive analysis.
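To see the formula in action, here is a tiny hand-computed sketch in Python. The probabilities are invented for illustration; in a real project they would be estimated from the data set (or handled by a library such as scikit-learn).

```python
# Toy Naive Bayes calculation for one trait (x) and one class (c).
p_x_given_c = 0.8   # P(x|c): chance of seeing trait x among items of class c
p_c = 0.3           # P(c):   overall chance of class c
p_x = 0.4           # P(x):   overall chance of trait x

p_c_given_x = p_x_given_c * p_c / p_x   # P(c|x) = P(x|c) * P(c) / P(x)
print(p_c_given_x)  # 0.6 -> a 60% chance the item belongs to class c
```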

Cohorts Technique

  • Description: cohorts technique is a type of clustering method used in behavioral sciences to separate users by common traits. As with clustering, it can be done intuitively or mathematically, the latter of which would simply be k-means.
  • Importance: Very high. In the way it resembles k-means, the cohort technique is more of a high-level counterpart to it. In fact, most people are familiar with it as a part of Google Analytics. It’s most common in marketing departments in corporations, rather than in research.
  • Nature of Data: the nature of cohort data is data sets in which users are the observation and other fields are used as defining traits for each cohort.
  • Motive: the motive for cohort analysis techniques is to group similar users and analyze how you retain them and how they churn (a small sketch follows this list).
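Here is a small cohort sketch in Python with pandas (assumed installed). Each row is an invented “user was active in this month” record, and users are grouped by the month they signed up.

```python
import pandas as pd

# One row per (user, active month); signup_month defines each user's cohort.
df = pd.DataFrame({
    "user":         ["a", "a", "a", "b", "b", "c", "c", "c"],
    "signup_month": ["2020-01", "2020-01", "2020-01",
                     "2020-01", "2020-01",
                     "2020-02", "2020-02", "2020-02"],
    "active_month": ["2020-01", "2020-02", "2020-03",
                     "2020-01", "2020-02",
                     "2020-02", "2020-03", "2020-04"],
})

# Rows = cohort (signup month), columns = month of activity, values = active users.
cohort_table = pd.crosstab(df["signup_month"], df["active_month"])
print(cohort_table)
```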

Factor Technique

  • Description: the factor analysis technique is a way of grouping many traits into a single factor to expedite analysis. For example, factors can be used as traits for Naïve Bayes classifications instead of more general fields (a minimal sketch follows this list).
  • Importance: High. While not commonly employed in corporations, factor analysis is hugely valuable. Good data analysts use it to simplify their projects and communicate them more clearly.
  • Nature of Data: the nature of data useful in factor analysis techniques is data sets with a large number of fields per observation.
  • Motive: the motive for using factor analysis techniques is to reduce the number of fields in order to more quickly analyze and communicate findings.
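As a rough sketch of what this looks like in code, here is factor analysis with scikit-learn (assumed installed). The 100 observations with 5 fields are randomly generated for illustration.

```python
# Compress five fields into two underlying factors.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # 100 observations, 5 fields

fa = FactorAnalysis(n_components=2, random_state=0)
factors = fa.fit_transform(X)          # each observation is now described by 2 factors
print(factors.shape)                   # (100, 2)
```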

Linear Discriminants Technique

  • Description: linear discriminant analysis techniques are similar to regressions in that they use one or more independent variables to determine a dependent variable; however, the linear discriminant technique falls under a classifier method since it uses traits as independent variables and class as a dependent variable. In this way, it becomes a classifying method AND a predictive method (a minimal sketch follows this list).
  • Importance: High. Though the analyst world speaks of and uses linear discriminants less commonly, it’s a highly valuable technique to keep in mind as you progress in data analysis.
  • Nature of Data: the nature of data useful for the linear discriminant technique is data sets with many fields.
  • Motive: the motive for using linear discriminants is to classify observations that would otherwise be too complex for simple techniques like Naïve Bayes.
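Here is a minimal linear discriminant sketch with scikit-learn (assumed installed); the traits and classes are invented for illustration.

```python
# Use two traits to predict a class label.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([[1.0, 2.1], [1.2, 1.9], [0.9, 2.0],    # class 0 observations
              [3.0, 4.2], [3.1, 3.9], [2.9, 4.1]])   # class 1 observations
y = np.array([0, 0, 0, 1, 1, 1])

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[1.1, 2.0], [3.0, 4.0]]))   # classify two new observations
```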

Exponential Smoothing Technique

  • Description: exponential smoothing is a technique falling under the forecasting method that uses a smoothing factor on prior data in order to predict future values. It can be linear or adjusted for seasonality. The basic principle behind exponential smoothing is to use a percent weight (a value between 0 and 1 called alpha) on more recent values in a series and a smaller percent weight on less recent values. The formula is: forecast = alpha * current period value + (1 - alpha) * previous forecast.
  • Importance: High. Most analysts still use the moving average technique (covered next) for forecasting because it’s easier to understand, even though it is less efficient than exponential smoothing. However, good analysts will have exponential smoothing techniques in their pocket to increase the value of their forecasts.
  • Nature of Data: the nature of data useful for exponential smoothing is time series data . Time series data has time as part of its fields .
  • Motive: the motive for exponential smoothing is to forecast future values with a smoothing variable (a minimal sketch follows this list).
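Here is a minimal Python sketch of simple exponential smoothing following the formula above. The sales series and alpha value are invented for illustration.

```python
# Simple exponential smoothing:
# forecast = alpha * current value + (1 - alpha) * previous forecast.
def exponential_smoothing(series, alpha=0.5):
    forecasts = [series[0]]                      # seed with the first observed value
    for value in series[1:]:
        forecasts.append(alpha * value + (1 - alpha) * forecasts[-1])
    return forecasts

sales = [100, 120, 110, 130, 125]                # time series data, one value per period
print(exponential_smoothing(sales, alpha=0.3))
```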

Moving Average Technique

  • Description: the moving average technique falls under the forecasting method and uses an average of recent values to predict future ones. For example, to predict rainfall in April, you would take the average of rainfall from January to March. It’s simple, yet highly effective.
  • Importance: Very high. While I’m personally not a huge fan of moving averages due to their simplistic nature and lack of consideration for seasonality, they’re the most common forecasting technique and therefore very important.
  • Nature of Data: the nature of data useful for moving averages is time series data .
  • Motive: the motive for moving averages is to predict future values in a simple, easy-to-communicate way (a minimal sketch follows this list).
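Here is a minimal Python sketch of the rainfall example above; the monthly values are invented for illustration.

```python
# Forecast April rainfall as the average of the three prior months.
rainfall = {"Jan": 80, "Feb": 95, "Mar": 70}

forecast_april = sum(rainfall.values()) / len(rainfall)
print(forecast_april)   # about 81.67: the moving-average forecast for April
```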

Neural Networks Technique

  • Description: neural networks are a highly complex artificial intelligence technique that replicate a human’s neural analysis through a series of hyper-rapid computations and comparisons that evolve in real time. This technique is so complex that an analyst must use computer programs to perform it.
  • Importance: Medium. While the potential for neural networks is theoretically unlimited, it’s still little understood and therefore uncommon. You do not need to know it by any means in order to be a data analyst.
  • Nature of Data: the nature of data useful for neural networks is data sets of astronomical size, meaning hundreds of thousands of fields and at least as many rows.
  • Motive: the motive for neural networks is to understand wildly complex phenomena and data in order to act on them thereafter.

Decision Tree Technique

  • Description: the decision tree technique uses artificial intelligence algorithms to rapidly calculate possible decision pathways and their outcomes on a real-time basis. It’s so complex that computer programs are needed to perform it.
  • Importance: Medium. As with neural networks, decision trees with AI are too little understood and are therefore uncommon in corporate and research settings alike.
  • Nature of Data: the nature of data useful for the decision tree technique is hierarchical data sets that show multiple optional fields for each preceding field.
  • Motive: the motive for decision tree techniques is to compute the optimal choices to make in order to achieve a desired result (a small classifier sketch follows this list).
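The description above frames decision trees as an AI technique for computing decision pathways; a common, concrete relative is the decision tree classifier, which learns a tree of if/then splits from data. Here is a minimal sketch with scikit-learn (assumed installed); the loan records are invented for illustration.

```python
# Learn simple decision pathways from past loan decisions, then apply them to a new case.
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[25, 20000], [30, 35000], [45, 80000], [50, 95000]]   # fields: [age, income]
y = [0, 0, 1, 1]                                            # 0 = reject, 1 = approve

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["age", "income"]))   # the learned decision pathways
print(tree.predict([[40, 70000]]))                          # predicted outcome for a new case
```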

Evolutionary Programming Technique

  • Description: the evolutionary programming technique uses a series of neural networks, sees how well each one fits a desired outcome, and selects only the best to test and retest. It’s called evolutionary because it resembles the process of natural selection by weeding out weaker options.
  • Importance: Medium. As with the other AI techniques, evolutionary programming just isn’t well-understood enough to be usable in many cases. Its complexity also makes it hard to explain in corporate settings and difficult to defend in research settings.
  • Nature of Data: the nature of data in evolutionary programming is data sets of neural networks, or data sets of data sets.
  • Motive: the motive for using evolutionary programming is similar to decision trees: understanding the best possible option from complex data.

Fuzzy Logic Technique

  • Description: fuzzy logic is a type of computing based on “approximate truths” rather than simple truths such as “true” and “false.” It is essentially two tiers of classification. For example, to say whether “Apples are good,” you need to first classify that “Good is x, y, z.” Only then can you say apples are good. Another way to see it is as helping a computer evaluate truth the way humans do: “definitely true, probably true, maybe true, probably false, definitely false.”
  • Importance: Medium. Like the other AI techniques, fuzzy logic is uncommon in both research and corporate settings, which means it’s less important in today’s world.
  • Nature of Data: the nature of fuzzy logic data is huge data tables that include other huge data tables with a hierarchy including multiple subfields for each preceding field.
  • Motive: the motive of fuzzy logic is to replicate human truth valuations in a computer in order to model human decisions based on past data. The obvious possible application is marketing.

Text Analysis Technique

  • Description: text analysis techniques fall under the qualitative data analysis type and use text to extract insights.
  • Importance: Medium. Text analysis techniques, like the rest of the qualitative analysis type, are most valuable for researchers.
  • Nature of Data: the nature of data useful in text analysis is words.
  • Motive: the motive for text analysis is to trace themes in a text across sets of very long documents, such as books.

Coding Technique

  • Description: the coding technique is used in textual analysis to turn ideas into uniform phrases and analyze the number of times and the ways in which those ideas appear. For this reason, some consider it a quantitative technique as well. You can learn more about coding and the other qualitative techniques here .
  • Importance: Very high. If you’re a researcher working in social sciences, coding is THE analysis technique, and for good reason. It’s a great way to add rigor to analysis. That said, it’s less common in corporate settings.
  • Nature of Data: the nature of data useful for coding is long text documents.
  • Motive: the motive for coding is to make tracing ideas on paper more than an exercise of the mind by quantifying it and understanding it through descriptive methods.

Idea Pattern Technique

  • Description: the idea pattern analysis technique fits into coding as the second step of the process. Once themes and ideas are coded, simple descriptive analysis tests may be run. Some people even cluster the ideas!
  • Importance: Very high. If you’re a researcher, idea pattern analysis is as important as the coding itself.
  • Nature of Data: the nature of data useful for idea pattern analysis is already coded themes.
  • Motive: the motive for the idea pattern technique is to trace ideas in otherwise unmanageably-large documents.

Word Frequency Technique

  • Description: word frequency is a qualitative technique that stands in opposition to coding and uses an inductive approach to locate specific words in a document in order to understand its relevance. Word frequency is essentially the descriptive analysis of qualitative data because it uses stats like mean, median, and mode to gather insights.
  • Importance: High. As with the other qualitative approaches, word frequency is very important in social science research, but less so in corporate settings.
  • Nature of Data: the nature of data useful for word frequency is long, informative documents.
  • Motive: the motive for word frequency is to locate target words to determine the relevance of a document in question (a short sketch follows this list).
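Here is a minimal word frequency sketch using only Python’s standard library; the document text is invented for illustration.

```python
# Count how often each word appears in a document.
from collections import Counter
import re

document = "Hunger and food security were raised often; food came up in nearly every interview."

words = re.findall(r"[a-z']+", document.lower())   # tokenize into lowercase words
counts = Counter(words)
print(counts.most_common(3))                       # most frequent words, e.g. ('food', 2)
```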

Types of data analysis in research

Types of data analysis in research methodology include every item discussed in this article. As a list, they are:

  • Quantitative
  • Qualitative
  • Mathematical
  • Machine Learning and AI
  • Descriptive
  • Diagnostic
  • Predictive
  • Prescriptive
  • Clustering
  • Classification
  • Forecasting
  • Optimization
  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Framework analysis
  • Grounded theory
  • Artificial Neural Networks
  • Decision Trees
  • Evolutionary Programming
  • Fuzzy Logic
  • Text analysis
  • Coding
  • Idea Pattern Analysis
  • Word Frequency Analysis
  • K-means
  • Regression
  • Naïve Bayes
  • Cohorts
  • Factor analysis
  • Linear discriminant
  • Exponential smoothing
  • Moving average

Types of data analysis in qualitative research

As a list, the types of data analysis in qualitative research are the following methods:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Framework analysis
  • Grounded theory

Types of data analysis in quantitative research

As a list, the types of data analysis in quantitative research are:

  • Mathematical analysis (descriptive, diagnostic, predictive, and prescriptive)
  • Artificial intelligence & machine learning analysis

Data analysis methods

As a list, data analysis methods are:

  • Clustering (quantitative)
  • Classification (quantitative)
  • Forecasting (quantitative)
  • Optimization (quantitative)
  • Content (qualitative)
  • Narrative (qualitative)
  • Discourse (qualitative)
  • Framework (qualitative)
  • Grounded theory (qualitative)

Quantitative data analysis methods

As a list, quantitative data analysis methods are:

  • Clustering
  • Classification
  • Forecasting
  • Optimization

Tabular View of Data Analysis Types, Methods, and Techniques
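A condensed view, reconstructed from the tree diagram and the sections above:

Quantitative > Mathematical (descriptive, diagnostic, predictive, prescriptive)
  Methods: clustering, classification, forecasting, optimization
  Techniques: k-means, regression, Naïve Bayes, cohorts, factors, linear discriminants, exponential smoothing, moving average

Quantitative > Artificial Intelligence & Machine Learning
  Techniques: artificial neural networks, decision trees, evolutionary programming, fuzzy logic

Qualitative
  Methods: content analysis, narrative analysis, discourse analysis, framework analysis, grounded theory
  Techniques: text analysis, coding, idea pattern analysis, word frequency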

About the author.

Noah is the founder & Editor-in-Chief at AnalystAnswers. He is a transatlantic professional and entrepreneur with 5+ years of corporate finance and data analytics experience, as well as 3+ years in consumer financial products and business software. He started AnalystAnswers to provide aspiring professionals with accessible explanations of otherwise dense finance and data concepts. Noah believes everyone can benefit from an analytical mindset in a growing digital world. When he's not busy at work, Noah likes to explore new European cities, exercise, and spend time with friends and family.



Data Analysis in Research: Types & Methods


Content Index

  • Why analyze data in research?
  • Types of data in research
  • Finding patterns in the qualitative data
  • Methods used for data analysis in qualitative research
  • Preparing data for analysis
  • Methods used for data analysis in quantitative research
  • Considerations in research data analysis
  • What is data analysis in research?

Definition of research in data analysis: According to LeCompte and Schensul, research data analysis is a process used by researchers to reduce data to a story and interpret it to derive insights. The data analysis process helps reduce a large chunk of data into smaller fragments that make sense.

Three essential things occur during the data analysis process. The first is data organization. The second is summarization and categorization, which together reduce the data and help find patterns and themes for easy identification and linking. The third and last is the analysis itself, which researchers perform in both top-down and bottom-up fashion.

LEARN ABOUT: Research Process Steps

On the other hand, Marshall and Rossman describe data analysis as a messy, ambiguous, and time-consuming but creative and fascinating process through which a mass of collected data is brought to order, structure and meaning.

We can say that “the data analysis and data interpretation is a process representing the application of deductive and inductive logic to the research and data analysis.”

Researchers rely heavily on data as they have a story to tell or research problems to solve. It starts with a question, and data is nothing but an answer to that question. But, what if there is no question to ask? Well! It is possible to explore data even without a problem – we call it ‘Data Mining’, which often reveals some interesting patterns within the data that are worth exploring.

Irrelevant to the type of data researchers explore, their mission and audiences’ vision guide them to find the patterns to shape the story they want to tell. One of the essential things expected from researchers while analyzing data is to stay open and remain unbiased toward unexpected patterns, expressions, and results. Remember, sometimes, data analysis tells the most unforeseen yet exciting stories that were not expected when initiating data analysis. Therefore, rely on the data you have at hand and enjoy the journey of exploratory research. 


Every kind of data has the rare quality of describing things once a specific value is assigned to it. For analysis, you need to organize these values and process and present them in a given context to make them useful. Data can be in different forms; here are the primary data types.

  • Qualitative data: When the data presented has words and descriptions, then we call it qualitative data . Although you can observe this data, it is subjective and harder to analyze in research, especially for comparison. Example: anything describing taste, experience, texture, or an opinion is considered qualitative data. This type of data is usually collected through focus groups, personal qualitative interviews , qualitative observation or using open-ended questions in surveys.
  • Quantitative data: Any data expressed in numbers or numerical figures is called quantitative data . This type of data can be distinguished into categories, grouped, measured, calculated, or ranked. Example: responses about age, rank, cost, length, weight, scores, etc. all come under this type of data. You can present such data in graphical format, charts, or apply statistical analysis methods to this data. The (Outcomes Measurement Systems) OMS questionnaires in surveys are a significant source of collecting numeric data.
  • Categorical data: It is data presented in groups. However, an item included in the categorical data cannot belong to more than one group. Example: A person responding to a survey by telling his living style, marital status, smoking habit, or drinking habit comes under the categorical data. A chi-square test is a standard method used to analyze this data.

Learn More : Examples of Qualitative Data in Education

Data analysis in qualitative research

Data analysis in qualitative research works a little differently from numerical data analysis, as qualitative data is made up of words, descriptions, images, objects, and sometimes symbols. Getting insight from such complicated information is a complicated process. Hence it is typically used for exploratory research and data analysis .

Although there are several ways to find patterns in textual information, a word-based method is the most relied-upon and widely used global technique for research and data analysis. Notably, the data analysis process in qualitative research is manual. Here the researchers usually read the available data and find repetitive or commonly used words.

For example, while studying data collected from African countries to understand the most pressing issues people face, researchers might find  “food”  and  “hunger” are the most commonly used words and will highlight them for further analysis.

LEARN ABOUT: Level of Analysis

The keyword context is another widely used word-based technique. In this method, the researcher tries to understand the concept by analyzing the context in which the participants use a particular keyword.  

For example , researchers conducting research and data analysis for studying the concept of ‘diabetes’ amongst respondents might analyze the context of when and how the respondent has used or referred to the word ‘diabetes.’

The scrutiny-based technique is also one of the highly recommended text analysis methods used to identify a qualitative data pattern. Compare and contrast is the widely used method under this technique to see how a specific piece of text is similar to or different from others.

For example: To find out the “importance of a resident doctor in a company,” the collected data is divided into people who think it is necessary to hire a resident doctor and those who think it is unnecessary. Compare and contrast is the best method to analyze polls with single-answer question types .

Metaphors can be used to reduce the data pile and find patterns in it so that it becomes easier to connect data with theory.

Variable Partitioning is another technique used to split variables so that researchers can find more coherent descriptions and explanations from the enormous data.

LEARN ABOUT: Qualitative Research Questions and Questionnaires

There are several techniques to analyze the data in qualitative research, but here are some commonly used methods,

  • Content Analysis:  It is widely accepted and the most frequently employed technique for data analysis in research methodology. It can be used to analyze the documented information from text, images, and sometimes from the physical items. It depends on the research questions to predict when and where to use this method.
  • Narrative Analysis: This method is used to analyze content gathered from various sources such as personal interviews, field observation, and  surveys . The majority of times, stories, or opinions shared by people are focused on finding answers to the research questions.
  • Discourse Analysis:  Similar to narrative analysis, discourse analysis is used to analyze the interactions with people. Nevertheless, this particular method considers the social context under which or within which the communication between the researcher and respondent takes place. In addition to that, discourse analysis also focuses on the lifestyle and day-to-day environment while deriving any conclusion.
  • Grounded Theory:  When you want to explain why a particular phenomenon happened, then using grounded theory for analyzing quality data is the best resort. Grounded theory is applied to study data about the host of similar cases occurring in different settings. When researchers are using this method, they might alter explanations or produce new ones until they arrive at some conclusion.

LEARN ABOUT: 12 Best Tools for Researchers

Data analysis in quantitative research

The first stage in research and data analysis is to prepare the data for analysis so that the nominal data can be converted into something meaningful. Data preparation consists of the phases below.

Phase I: Data Validation

Data validation is done to understand whether the collected data sample meets the pre-set standards or is a biased sample. It is divided into four different stages:

  • Fraud: To ensure an actual human being records each response to the survey or the questionnaire
  • Screening: To make sure each participant or respondent is selected or chosen in compliance with the research criteria
  • Procedure: To ensure ethical standards were maintained while collecting the data sample
  • Completeness: To ensure that the respondent answered all the questions in an online survey, or that the interviewer asked all the questions devised in the questionnaire

Phase II: Data Editing

More often than not, an extensive research data sample comes loaded with errors. Respondents sometimes fill in some fields incorrectly or sometimes skip them accidentally. Data editing is a process wherein the researchers have to confirm that the provided data is free of such errors. They need to conduct necessary checks, including outlier checks, to edit the raw data and make it ready for analysis.

Phase III: Data Coding

Out of all three, this is the most critical phase of data preparation, associated with grouping and assigning values to the survey responses . If a survey is completed with a sample size of 1,000, the researcher might create age brackets to distinguish the respondents based on their age. It then becomes easier to analyze small data buckets rather than deal with the massive data pile.

LEARN ABOUT: Steps in Qualitative Research

After the data is prepared for analysis, researchers are open to using different research and data analysis methods to derive meaningful insights. For sure, statistical analysis is the most favored way to analyze numerical data. In statistical analysis, distinguishing between categorical data and numerical data is essential, as categorical data involves distinct categories or labels, while numerical data consists of measurable quantities. The method is again classified into two groups: first, ‘descriptive statistics’, used to describe data; second, ‘inferential statistics’, which helps in comparing the data .

Descriptive statistics

This method is used to describe the basic features of versatile types of data in research. It presents the data in such a meaningful way that patterns in the data start making sense. Nevertheless, descriptive analysis does not allow conclusions beyond the data at hand; any conclusions are still based on the hypotheses researchers have formulated so far. Here are a few major types of descriptive analysis methods.

Measures of Frequency

  • Count, Percent, Frequency
  • It is used to denote how often a particular event occurs.
  • Researchers use it when they want to showcase how often a response is given.

Measures of Central Tendency

  • Mean, Median, Mode
  • The method is widely used to demonstrate distribution by various points.
  • Researchers use this method when they want to showcase the most commonly or averagely indicated response.

Measures of Dispersion or Variation

  • Range, Variance, Standard deviation
  • Range = the difference between the highest and lowest observed values.
  • Variance and standard deviation = measures of how far observed scores differ from the mean.
  • It is used to identify the spread of scores by stating intervals.
  • Researchers use this method to showcase how spread out the data are. It helps them see how widely the scores vary and how that spread affects the mean.

Measures of Position

  • Percentile ranks, Quartile ranks
  • It relies on standardized scores helping researchers to identify the relationship between different scores.
  • It is often used when researchers want to compare scores with the average count.
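To make these measures concrete, here is a minimal Python sketch using the standard library and NumPy (assumed installed); the test scores are invented for illustration.

```python
# Basic descriptive statistics for a small set of test scores.
import statistics
import numpy as np

scores = [62, 71, 71, 75, 80, 84, 90, 95]

print(statistics.mean(scores))        # central tendency: mean
print(statistics.median(scores))      # central tendency: median
print(statistics.mode(scores))        # central tendency: mode
print(max(scores) - min(scores))      # dispersion: range
print(statistics.stdev(scores))       # dispersion: sample standard deviation
print(np.percentile(scores, 75))      # position: 75th percentile (upper quartile)
```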

For quantitative research, descriptive analysis often gives absolute numbers, but those numbers alone are not sufficient to demonstrate the rationale behind them. Nevertheless, it is necessary to think of the best method for research and data analysis suiting your survey questionnaire and the story researchers want to tell. For example, the mean is the best way to demonstrate the students’ average scores in schools. It is better to rely on descriptive statistics when the researchers intend to keep the research or outcome limited to the provided sample without generalizing it. For example, when you want to compare the average votes cast in two different cities, descriptive statistics are enough.

Descriptive analysis is also called a ‘univariate analysis’ since it is commonly used to analyze a single variable.

Inferential statistics

Inferential statistics are used to make predictions about a larger population after research and data analysis of a sample representing that population. For example, you can ask some 100-odd audience members at a movie theater if they like the movie they are watching. Researchers then use inferential statistics on the collected sample to reason that about 80-90% of people like the movie.

Here are two significant areas of inferential statistics.

  • Estimating parameters: It takes statistics from the sample research data and demonstrates something about the population parameter.
  • Hypothesis test: It’s about sampling research data to answer the survey research questions. For example, researchers might be interested to understand if the new shade of lipstick recently launched is good or not, or if the multivitamin capsules help children to perform better at games. (A minimal hypothesis-test sketch follows this list.)
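Here is a minimal hypothesis-test sketch in Python using SciPy (assumed installed); the two groups of satisfaction ratings are invented for illustration.

```python
# Independent two-sample t-test: do the two groups differ on average?
from scipy import stats

group_a = [7, 8, 6, 9, 7, 8]   # e.g. ratings from respondents who saw version A
group_b = [5, 6, 5, 7, 6, 5]   # e.g. ratings from respondents who saw version B

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(t_stat, p_value)          # a small p-value suggests a statistically significant difference
```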

The following are more sophisticated analysis methods, used to showcase the relationship between different variables rather than describe a single variable. They are often used when researchers want something beyond absolute numbers to understand the relationship between variables.

Here are some of the commonly used methods for data analysis in research.

  • Correlation: When researchers are not conducting experimental or quasi-experimental research but are interested in understanding the relationship between two or more variables, they opt for correlational research methods.
  • Cross-tabulation: Also called contingency tables, cross-tabulation is used to analyze the relationship between multiple variables. Suppose the provided data has age and gender categories presented in rows and columns. A two-dimensional cross-tabulation helps for seamless data analysis and research by showing the number of males and females in each age category (see the cross-tabulation sketch after this list).
  • Regression analysis: For understanding the strong relationship between two variables, researchers do not look beyond the primary and commonly used regression analysis method, which is also a type of predictive analysis. In this method, you have an essential factor called the dependent variable and one or more independent variables. You undertake efforts to find out the impact of the independent variables on the dependent variable. The values of both independent and dependent variables are assumed to have been ascertained in an error-free, random manner.
  • Frequency tables: This procedure summarizes how often each value or category occurs in the data, which helps researchers see the distribution of responses before applying further tests.
  • Analysis of variance: The statistical procedure is used for testing the degree to which two or more vary or differ in an experiment. A considerable degree of variation means research findings were significant. In many contexts, ANOVA testing and variance analysis are similar.
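As a quick illustration of cross-tabulation, here is a minimal sketch with pandas (assumed installed); the survey rows are invented.

```python
# Two-dimensional contingency table: males and females in each age category.
import pandas as pd

df = pd.DataFrame({
    "gender":    ["M", "F", "F", "M", "F", "M", "F", "M"],
    "age_group": ["18-25", "18-25", "26-35", "26-35", "26-35", "36-45", "36-45", "18-25"],
})

print(pd.crosstab(df["age_group"], df["gender"]))
```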
Considerations in research data analysis

  • Researchers must have the necessary research skills to analyze and manipulate the data, and should be trained to demonstrate a high standard of research practice. Ideally, researchers possess more than a basic understanding of the rationale for selecting one statistical method over another to obtain better data insights.
  • Usually, research and data analytics projects differ by scientific discipline; therefore, getting statistical advice at the beginning of analysis helps design a survey questionnaire, select data collection methods , and choose samples.

LEARN ABOUT: Best Data Collection Tools

  • The primary aim of data research and analysis is to derive ultimate insights that are unbiased. Any mistake in collecting data, selecting an analysis method, or choosing an audience sample, or approaching any of these with a biased mindset, will lead to a biased inference.
  • No amount of sophistication in research data and analysis can rectify poorly defined objective outcome measurements. Whether the design is at fault or the intentions are unclear, a lack of clarity can mislead readers, so avoid the practice.
  • The motive behind data analysis in research is to present accurate and reliable data. As far as possible, avoid statistical errors, and find a way to deal with everyday challenges like outliers, missing data, data altering, data mining , or developing graphical representation.

LEARN MORE: Descriptive Research vs Correlational Research

The sheer amount of data generated daily is staggering, especially now that data analysis has taken center stage. In 2018 alone, the total data supply amounted to 2.8 trillion gigabytes. Hence, it is clear that enterprises willing to survive in the hypercompetitive world must possess an excellent capability to analyze complex research data, derive actionable insights, and adapt to new market needs.

LEARN ABOUT: Average Order Value

QuestionPro is an online survey platform that empowers organizations in data analysis and research and provides them a medium to collect data by creating appealing surveys.



Research Methods | Definition, Types, Examples

Research methods are specific procedures for collecting and analysing data. Developing your research methods is an integral part of your research design . When planning your methods, there are two key decisions you will make.

First, decide how you will collect data . Your methods depend on what type of data you need to answer your research question :

  • Qualitative vs quantitative : Will your data take the form of words or numbers?
  • Primary vs secondary : Will you collect original data yourself, or will you use data that have already been collected by someone else?
  • Descriptive vs experimental : Will you take measurements of something as it is, or will you perform an experiment?

Second, decide how you will analyse the data .

  • For quantitative data, you can use statistical analysis methods to test relationships between variables.
  • For qualitative data, you can use methods such as thematic analysis to interpret patterns and meanings in the data.

Table of contents

  • Methods for collecting data
  • Examples of data collection methods
  • Methods for analysing data
  • Examples of data analysis methods
  • Frequently asked questions about methodology

Methods for collecting data

Data are the information that you collect for the purposes of answering your research question . The type of data you need depends on the aims of your research.

Qualitative vs quantitative data

Your choice of qualitative or quantitative data collection depends on the type of knowledge you want to develop.

For questions about ideas, experiences and meanings, or to study something that can’t be described numerically, collect qualitative data .

If you want to develop a more mechanistic understanding of a topic, or your research involves hypothesis testing , collect quantitative data .

You can also take a mixed methods approach, where you use both qualitative and quantitative research methods.

Primary vs secondary data

Primary data are any original information that you collect for the purposes of answering your research question (e.g. through surveys , observations and experiments ). Secondary data are information that has already been collected by other researchers (e.g. in a government census or previous scientific studies).

If you are exploring a novel research question, you’ll probably need to collect primary data. But if you want to synthesise existing knowledge, analyse historical trends, or identify patterns on a large scale, secondary data might be a better choice.

Descriptive vs experimental data

In descriptive research , you collect data about your study subject without intervening. The validity of your research will depend on your sampling method .

In experimental research , you systematically intervene in a process and measure the outcome. The validity of your research will depend on your experimental design .

To conduct an experiment, you need to be able to vary your independent variable , precisely measure your dependent variable, and control for confounding variables . If it’s practically and ethically possible, this method is the best choice for answering questions about cause and effect.


Your data analysis methods will depend on the type of data you collect and how you prepare them for analysis.

Data can often be analysed both quantitatively and qualitatively. For example, survey responses could be analysed qualitatively by studying the meanings of responses or quantitatively by studying the frequencies of responses.

Qualitative analysis methods

Qualitative analysis is used to understand words, ideas, and experiences. You can use it to interpret data that were collected:

  • From open-ended survey and interview questions, literature reviews, case studies, and other sources that use text rather than numbers.
  • Using non-probability sampling methods .

Qualitative analysis tends to be quite flexible and relies on the researcher’s judgement, so you have to reflect carefully on your choices and assumptions.

Quantitative analysis methods

Quantitative analysis uses numbers and statistics to understand frequencies, averages and correlations (in descriptive studies) or cause-and-effect relationships (in experiments).

You can use quantitative analysis to interpret data that were collected either:

  • During an experiment.
  • Using probability sampling methods .

Because the data are collected and analysed in a statistically valid way, the results of quantitative analysis can be easily standardised and shared among researchers.
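To make the quantitative route concrete, here is a minimal sketch (not part of the original article) that tests the relationship between two variables with a Pearson correlation in Python. The survey-style numbers and variable names are invented purely for illustration; a real analysis would also check assumptions such as linearity and sample size.

```python
# Hypothetical sketch: testing for a linear relationship between two
# survey-style variables with a Pearson correlation. Data are invented.
from scipy import stats

hours_studied = [2, 4, 5, 7, 8, 10, 12, 14]    # hypothetical responses
exam_scores   = [51, 55, 60, 64, 70, 74, 80, 83]

r, p_value = stats.pearsonr(hours_studied, exam_scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
# A small p-value suggests the observed correlation would be unlikely if
# there were truly no linear relationship between the variables.
```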

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to test a hypothesis by systematically collecting and analysing data, while qualitative methods allow you to explore ideas and experiences in depth.

In mixed methods research , you use both qualitative and quantitative data collection and analysis methods to answer your research question .

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.
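As a small, hypothetical illustration of simple random sampling, the sketch below draws 100 names from an invented roster of 20,000 students so that every student has an equal chance of selection. The roster and random seed are assumptions made only for the example.

```python
# Hypothetical sketch of simple random sampling from an invented roster.
import random

population = [f"student_{i}" for i in range(1, 20001)]  # made-up roster
random.seed(42)                                         # reproducible draw
sample = random.sample(population, k=100)               # equal-chance sample

print(len(sample), sample[:3])
```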

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts, and meanings, use qualitative methods .
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Methodology refers to the overarching strategy and rationale of your research project . It involves studying the methods used in your field and the theories or principles behind them, in order to develop an approach that matches your objectives.

Methods are the specific tools and procedures you use to collect and analyse data (e.g. experiments, surveys , and statistical tests ).

In shorter scientific papers, where the aim is to report the findings of a specific study, you might simply describe what you did in a methods section .

In a longer or more complex research project, such as a thesis or dissertation , you will probably include a methodology section , where you explain your approach to answering the research questions and cite relevant sources to support your choice of methods.



Research Methods: Quantitative, Qualitative, and More (Overview)

  • Quantitative Research
  • Qualitative Research
  • Data Science Methods (Machine Learning, AI, Big Data)
  • Text Mining and Computational Text Analysis
  • Evidence Synthesis/Systematic Reviews
  • Get Data, Get Help!

About Research Methods

This guide provides an overview of research methods, how to choose and use them, and supports and resources at UC Berkeley. 

As Patten and Newhart note in the book Understanding Research Methods , "Research methods are the building blocks of the scientific enterprise. They are the "how" for building systematic knowledge. The accumulation of knowledge through research is by its nature a collective endeavor. Each well-designed study provides evidence that may support, amend, refute, or deepen the understanding of existing knowledge...Decisions are important throughout the practice of research and are designed to help researchers collect evidence that includes the full spectrum of the phenomenon under study, to maintain logical rules, and to mitigate or account for possible sources of bias. In many ways, learning research methods is learning how to see and make these decisions."

The choice of methods varies by discipline, by the kind of phenomenon being studied and the data being used to study it, by the technology available, and more.  This guide is an introduction, but if you don't see what you need here, always contact your subject librarian, and/or take a look to see if there's a library research guide that will answer your question. 

Suggestions for changes and additions to this guide are welcome! 

START HERE: SAGE Research Methods

Without question, the most comprehensive resource available from the library is SAGE Research Methods. The library maintains an online guide to this one-stop collection, and some helpful links are below:

  • SAGE Research Methods
  • Little Green Books  (Quantitative Methods)
  • Little Blue Books  (Qualitative Methods)
  • Dictionaries and Encyclopedias  
  • Case studies of real research projects
  • Sample datasets for hands-on practice
  • Streaming video: see methods come to life
  • Methodspace: a community for researchers
  • SAGE Research Methods Course Mapping

Library Data Services at UC Berkeley

Library Data Services Program and Digital Scholarship Services

The LDSP offers a variety of services and tools. From its site, check out pages for each of the following topics: discovering data, managing data, collecting data, GIS data, text data mining, publishing data, digital scholarship, open science, and the Research Data Management Program.

Be sure also to check out the visual guide to where to seek assistance on campus with any research question you may have!

Library GIS Services

Other Data Services at Berkeley

  • D-Lab: supports Berkeley faculty, staff, and graduate students with research in data-intensive social science, including a wide range of training and workshop offerings.
  • Dryad: a simple self-service tool for researchers to use in publishing their datasets. It provides tools for the effective publication of and access to research data.
  • Geospatial Innovation Facility (GIF): provides leadership and training across a broad array of integrated mapping technologies on campus.
  • Research Data Management: a UC Berkeley guide and consulting service for research data management issues.

General Research Methods Resources

Here are some general resources for assistance:

  • Assistance from ICPSR (must create an account to access): Getting Help with Data, and Resources for Students
  • Wiley Stats Ref for background information on statistics topics
  • Survey Documentation and Analysis (SDA): a program for easy web-based analysis of survey data.

Consultants

  • D-Lab/Data Science Discovery Consultants: request help with your research project from peer consultants.
  • Research data management (RDM) consulting: meet with RDM consultants before designing the data security, storage, and sharing aspects of your qualitative project.
  • Statistics Department Consulting Services: a service in which advanced graduate students, under faculty supervision, are available to consult during specified hours in the Fall and Spring semesters.

Related Resources

  • IRB / CPHS: qualitative research projects with human subjects often require that you go through an ethics review.
  • OURS (Office of Undergraduate Research and Scholarships): OURS supports undergraduates who want to embark on research projects and assistantships. In particular, check out their “Getting Started in Research” workshops.
  • Sponsored Projects: works with researchers applying for major external grants.

8 Types of Data Analysis


Data analysis is an aspect of data science and data analytics that involves examining data sets to serve a range of purposes. The data analysis process involves inspecting, cleaning, transforming and modeling data to draw useful insights from it.

What Are the Different Types of Data Analysis?

  • Descriptive analysis
  • Diagnostic analysis
  • Exploratory analysis
  • Inferential analysis
  • Predictive analysis
  • Causal analysis
  • Mechanistic analysis
  • Prescriptive analysis

With its multiple facets, methodologies and techniques, data analysis is used in a variety of fields, including business, science and social science, among others. As businesses thrive under the influence of technological advancements in data analytics, data analysis plays a huge role in  decision-making , providing a better, faster and more efficacious system that minimizes risks and reduces  human biases .

That said, there are different kinds of data analysis, each catering to a different goal. We’ll examine each one below.

Two Camps of Data Analysis

Data analysis can be divided into two camps, according to the book  R for Data Science :

  • Hypothesis Generation — This involves looking deeply at the data and combining your domain knowledge to generate hypotheses about why the data behaves the way it does.
  • Hypothesis Confirmation — This involves using a precise mathematical model to generate falsifiable predictions with statistical sophistication to confirm your prior hypotheses.

Types of Data Analysis

Data analysis can be separated and organized into types, arranged in an increasing order of complexity.

1. Descriptive Analysis

The goal of descriptive analysis is to describe or summarize a set of data. Here’s what you need to know:

  • Descriptive analysis is the very first analysis performed in the data analysis process.
  • It generates simple summaries about samples and measurements.
  • It involves common, descriptive statistics like measures of central tendency, variability, frequency and position.

Descriptive Analysis Example

Take the Covid-19 statistics page on Google, for example. Its line graph is a pure summary of cases and deaths: a presentation and description of how many people in a particular country have been infected by the virus.

Descriptive analysis is the first step in analysis where you summarize and describe the data you have using descriptive statistics, and the result is a simple presentation of your data.
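As a minimal sketch of this first step, the snippet below summarises an invented series of daily case counts with pandas; all numbers are made up for illustration only.

```python
# Minimal descriptive-analysis sketch: summarise invented daily case counts.
import pandas as pd

daily_cases = pd.Series([120, 135, 150, 160, 158, 170, 180],
                        name="daily_cases")

print(daily_cases.describe())               # count, mean, std, min, quartiles, max
print("median:", daily_cases.median())
print("mode:", daily_cases.mode().tolist())
```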


2. Diagnostic Analysis 

Diagnostic analysis seeks to answer the question “Why did this happen?” by taking a more in-depth look at data to uncover subtle patterns. Here’s what you need to know:

  • Diagnostic analysis typically comes after descriptive analysis, taking initial findings and investigating why certain patterns in data happen. 
  • Diagnostic analysis may involve analyzing other related data sources, including past data, to reveal more insights into current data trends.  
  • Diagnostic analysis is ideal for further exploring patterns in data to explain anomalies.  

Diagnostic Analysis Example

A footwear store wants to review its website traffic levels over the previous 12 months. Upon compiling and assessing the data, the company’s marketing team finds that June experienced above-average levels of traffic while July and August witnessed slightly lower levels of traffic. 

To find out why this difference occurred, the marketing team takes a deeper look. Team members break down the data to focus on specific categories of footwear. In the month of June, they discovered that pages featuring sandals and other beach-related footwear received a high number of views while these numbers dropped in July and August. 

Marketers may also review other factors like seasonal changes and company sales events to see if other variables could have contributed to this trend.   
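A hedged sketch of that diagnostic breakdown is below: hypothetical page-view counts are pivoted by month and footwear category with pandas to show what drives the June spike. Every figure and column name is invented.

```python
# Hypothetical diagnostic sketch: break monthly page views down by category.
import pandas as pd

traffic = pd.DataFrame({
    "month":    ["Jun", "Jun", "Jul", "Jul", "Aug", "Aug"],
    "category": ["sandals", "sneakers", "sandals", "sneakers", "sandals", "sneakers"],
    "views":    [42000, 18000, 21000, 19000, 17000, 20000],
})

breakdown = traffic.pivot_table(index="month", columns="category",
                                values="views", aggfunc="sum")
print(breakdown)  # sandal views dominate June, then drop off in Jul/Aug
```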

3. Exploratory Analysis (EDA)

Exploratory analysis involves examining or exploring data and finding relationships between variables that were previously unknown. Here’s what you need to know:

  • EDA helps you discover relationships between measures in your data; these relationships are not evidence of causation, as captured by the phrase, “Correlation doesn’t imply causation.”
  • It’s useful for discovering new connections and forming hypotheses. It drives design planning and data collection.

Exploratory Analysis Example

Climate change is an increasingly important topic, as the global temperature has gradually risen over the years. One example of exploratory data analysis on climate change involves taking the rise in temperature from 1950 to 2020 together with measures of human activity and industrialization and looking for relationships in the data. For example, you might plot the number of factories, cars on the road and airplane flights over the same period to see how each correlates with the rise in temperature.

Exploratory analysis explores data to find relationships between measures without identifying the cause. It’s most useful when formulating hypotheses.
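Below is a toy sketch of that kind of exploratory step: an invented table of yearly indicators is checked for pairwise correlations with pandas. The columns and values are assumptions made for illustration, and the resulting correlations say nothing about causation.

```python
# Toy EDA sketch: look for correlations among invented yearly indicators.
import pandas as pd

climate = pd.DataFrame({
    "year":         [1950, 1970, 1990, 2010, 2020],
    "factories_k":  [12, 30, 55, 90, 110],      # hypothetical (thousands)
    "flights_m":    [0.5, 4, 12, 30, 38],       # hypothetical (millions)
    "temp_anomaly": [-0.02, 0.03, 0.45, 0.72, 0.98],
})

# The correlation matrix hints at relationships worth investigating further.
print(climate.drop(columns="year").corr().round(2))
```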

4. Inferential Analysis

Inferential analysis involves using a small sample of data to infer information about a larger population of data.

The goal of statistical modeling itself is all about using a small amount of information to extrapolate and generalize information to a larger group. Here’s what you need to know:

  • Inferential analysis involves using a sample that is representative of the population and attaching a measure of uncertainty, such as a standard error or confidence interval, to your estimate.
  • The accuracy of inference depends heavily on your sampling scheme: if the sample isn’t representative of the population, the generalization will be inaccurate, a problem known as sampling bias.

Inferential Analysis Example

The idea of drawing an inference about the population at large from a smaller sample is intuitive. Many statistics you see in the media and on the internet are inferential: predictions about an event based on a small sample. For example, a psychological study on the benefits of sleep might involve a total of 500 people. In follow-ups, participants who slept seven to nine hours reported better overall attention spans and well-being, while those who slept less or more than that range reported reduced attention spans and energy. A study of 500 people is only a tiny portion of the roughly 7 billion people in the world, so its conclusions are an inference about the larger population.

Inferential analysis extrapolates and generalizes the information of the larger group with a smaller sample to generate analysis and predictions.
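A minimal sketch of that inferential step, assuming a simulated sample of 500 sleep durations, is shown below: SciPy is used to attach a 95% confidence interval to the sample mean. The data are generated, not drawn from any real study.

```python
# Inferential-analysis sketch: estimate a population mean from a simulated
# sample and attach a 95% confidence interval to it.
import numpy as np
from scipy import stats

sample = np.random.default_rng(0).normal(loc=7.6, scale=1.1, size=500)

mean = sample.mean()
sem = stats.sem(sample)                        # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, len(sample) - 1,
                                   loc=mean, scale=sem)
print(f"mean sleep = {mean:.2f} h, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")
```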

5. Predictive Analysis

Predictive analysis involves using historical or current data to find patterns and make predictions about the future. Here’s what you need to know:

  • The accuracy of the predictions depends on the input variables.
  • Accuracy also depends on the types of models. A linear model might work well in some cases, and in other cases it might not.
  • Using a variable to predict another one doesn’t denote a causal relationship.

Predictive Analysis Example

The US presidential election is a popular topic, and many prediction models are built to forecast the winning candidate. FiveThirtyEight did this for the 2016 and 2020 elections. Prediction analysis for an election requires input variables such as historical polling data, trends and current polling data in order to return a good prediction. Something as large as an election wouldn’t rely on a simple linear model, but on a complex model with careful tuning to best serve its purpose.

Predictive analysis takes data from the past and present to make predictions about the future.
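As a deliberately simplified sketch (nothing like a real election model), the snippet below fits a one-feature linear regression with scikit-learn on invented historical vote shares and extrapolates one step into the future.

```python
# Toy predictive sketch: fit a linear model on invented historical data
# and predict the next point. Real forecasting models are far more complex.
import numpy as np
from sklearn.linear_model import LinearRegression

years = np.array([[2008], [2012], [2016], [2020]])  # single feature: year
support = np.array([52.9, 51.1, 48.2, 51.3])        # invented vote shares

model = LinearRegression().fit(years, support)
prediction = model.predict(np.array([[2024]]))[0]
print(f"predicted 2024 share: {prediction:.1f}")
```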


6. Causal Analysis

Causal analysis looks at the cause and effect of relationships between variables and is focused on finding the cause of a correlation. Here’s what you need to know:

  • To find the cause, you have to question whether the observed correlations driving your conclusion are valid. Just looking at the surface data won’t help you discover the hidden mechanisms underlying the correlations.
  • Causal analysis is applied in randomized studies focused on identifying causation.
  • Causal analysis is the gold standard in data analysis and scientific studies where the cause of a phenomenon is to be extracted and singled out, like separating wheat from chaff.
  • Good data is hard to find and requires expensive research and studies. These studies are analyzed in aggregate (multiple groups), and the observed relationships are just average effects (mean) of the whole population. This means the results might not apply to everyone.

Causal Analysis Example  

Say you want to test out whether a new drug improves human strength and focus. To do that, you perform randomized control trials for the drug to test its effect. You compare the sample of candidates for your new drug against the candidates receiving a mock control drug through a few tests focused on strength and overall focus and attention. This will allow you to observe how the drug affects the outcome.

Causal analysis is about finding out the causal relationship between variables, and examining how a change in one variable affects another.
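A hedged sketch of how such a randomised comparison might be analysed is below: a two-sample t-test in SciPy compares invented strength scores for a treatment group against a placebo group.

```python
# Causal-analysis sketch: compare randomised treatment vs placebo groups
# with a two-sample t-test. All scores are invented for illustration.
from scipy import stats

treatment = [78, 82, 85, 80, 88, 84, 79, 86]  # hypothetical strength scores
placebo   = [75, 74, 79, 77, 76, 78, 73, 80]

t_stat, p_value = stats.ttest_ind(treatment, placebo)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# Because group assignment was randomised, a significant difference supports
# a causal interpretation of the average treatment effect.
```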

7. Mechanistic Analysis

Mechanistic analysis is used to understand exact changes in variables that lead to other changes in other variables. Here’s what you need to know:

  • It’s applied in the physical and engineering sciences: situations that require high precision and leave little room for error, where the only noise in the data is measurement error.
  • It’s designed to understand a biological or behavioral process, the pathophysiology of a disease or the mechanism of action of an intervention. 

Mechanistic Analysis Example

Many graduate-level research and complex topics are suitable examples, but to put it in simple terms, let’s say an experiment is done to simulate safe and effective nuclear fusion to power the world. A mechanistic analysis of the study would entail a precise balance of controlling and manipulating variables with highly accurate measures of both variables and the desired outcomes. It’s this intricate and meticulous modus operandi toward these big topics that allows for scientific breakthroughs and advancement of society.

Mechanistic analysis is in some ways a form of predictive analysis, but modified to tackle studies that require high precision and meticulous methodologies in the physical or engineering sciences.

8. Prescriptive Analysis 

Prescriptive analysis compiles insights from other previous data analyses and determines actions that teams or companies can take to prepare for predicted trends. Here’s what you need to know: 

  • Prescriptive analysis may come right after predictive analysis, but it may involve combining many different data analyses. 
  • Companies need advanced technology and plenty of resources to conduct prescriptive analysis. AI systems that process data and adjust automated tasks are an example of the technology required to perform prescriptive analysis.  

Prescriptive Analysis Example

Prescriptive analysis is pervasive in everyday life, driving the curated content users consume on social media. On platforms like TikTok and Instagram, algorithms can apply prescriptive analysis to review past content a user has engaged with and the kinds of behaviors they exhibited with specific posts. Based on these factors, an algorithm seeks out similar content that is likely to elicit the same response and recommends it on a user’s personal feed. 
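The sketch below is a deliberately crude, hypothetical version of that prescriptive step: candidate posts are scored by how much their topics overlap with topics the user engaged with before, and the highest-scoring post is recommended. Both the data and the scoring rule are invented; production recommender systems are far more sophisticated.

```python
# Toy prescriptive sketch: recommend the candidate post most similar to the
# topics a user has engaged with in the past. All data are invented.
past_engagement = {"cooking": 0.8, "travel": 0.5, "finance": 0.1}

candidates = [
    {"id": 1, "topics": ["cooking", "travel"]},
    {"id": 2, "topics": ["finance"]},
    {"id": 3, "topics": ["cooking"]},
]

def score(post):
    # Sum the user's past engagement weights for each of the post's topics.
    return sum(past_engagement.get(topic, 0.0) for topic in post["topics"])

recommended = max(candidates, key=score)
print("recommend post", recommended["id"])  # the post most like past likes
```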

When to Use the Different Types of Data Analysis 

  • Descriptive analysis summarizes the data at hand and presents your data in a comprehensible way.
  • Diagnostic analysis takes a more detailed look at data to reveal why certain patterns occur, making it a good method for explaining anomalies. 
  • Exploratory data analysis helps you discover correlations and relationships between variables in your data.
  • Inferential analysis is for generalizing the larger population with a smaller sample size of data.
  • Predictive analysis helps you make predictions about the future with data.
  • Causal analysis emphasizes finding the cause of a correlation between variables.
  • Mechanistic analysis is for measuring the exact changes in variables that lead to other changes in other variables.
  • Prescriptive analysis combines insights from different data analyses to develop a course of action teams and companies can take to capitalize on predicted outcomes. 

A few important tips to remember about data analysis include:

  • Correlation doesn’t imply causation.
  • EDA helps discover new connections and form hypotheses.
  • Accuracy of inference depends on the sampling scheme.
  • A good prediction depends on the right input variables.
  • A simple linear model with enough data usually does the trick.
  • Using a variable to predict another doesn’t denote causal relationships.
  • Good data is hard to find, and to produce it requires expensive research.
  • Study results are reported in aggregate as average effects, so they might not apply to every individual.


Grad Coach

Qualitative Data Analysis Methods 101:

The “Big 6” Methods + Examples

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods , one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.


What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers” . In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes , yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics . Qualitative research investigates the “softer side” of things to explore and describe , while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here .


So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.


In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course include links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses . We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes , summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.

Naturally, while content analysis is widely useful, it’s not without its drawbacks . One of the main issues with content analysis is that it can be very time-consuming , as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations, so don’t be put off by these – just be aware of them!
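To illustrate that “small splash of quantitative thinking”, here is a tiny, hypothetical Python sketch that counts how often words appear across a couple of invented texts; real content analysis would pair such counts with carefully defined codes and plenty of re-reading.

```python
# Tiny content-analysis sketch: count word frequencies across invented texts.
from collections import Counter
import re

documents = [
    "Ancient temples and ancient traditions draw travellers to India.",
    "India's ancient heritage sites remain its biggest tourist draw.",
]

word_counts = Counter()
for doc in documents:
    word_counts.update(re.findall(r"[a-z']+", doc.lower()))

print(word_counts.most_common(5))  # e.g. 'ancient' surfaces as a frequent term
```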

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means . Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope could provide powerful insights into their mindsets and perspectives . Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses , too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions.

QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate . So, discourse analysis is all about analysing language within its social context. In other words, analysing language – such as a conversation, a speech, etc – within the culture and society it takes place. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture , history or power dynamics (to name a few) have an effect on the way concepts are spoken about. So, if your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are many social influences in terms of how we speak to each other, the potential use of discourse analysis is vast . Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might land up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming, as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method.

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, a thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes . These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences , views, and opinions . Therefore, if your research aims and objectives involve understanding people’s experience or view of something, thematic analysis can be a great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop , or even change as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage as it means that data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you choose to use thematic analysis for your project and budget extra time for unexpected adjustments.

Thematic analysis takes bodies of data and groups them according to similarities (themes), which help us make sense of the content.
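As a deliberately crude sketch of only the tallying step (the interpretive coding itself is manual, iterative work), the snippet below counts how often hypothetical, researcher-assigned themes appear across a few invented reviews.

```python
# Sketch of tallying themes AFTER manual coding. Reviews and codes invented.
from collections import Counter

coded_reviews = [
    {"review": "Loved the salmon, so fresh!",  "themes": ["fresh ingredients"]},
    {"review": "Staff were lovely and quick.", "themes": ["friendly wait staff"]},
    {"review": "Fresh fish, friendly server.", "themes": ["fresh ingredients",
                                                          "friendly wait staff"]},
]

theme_counts = Counter(theme for r in coded_reviews for theme in r["themes"])
print(theme_counts.most_common())
```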

QDA Method #5: Grounded theory (GT) 

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “ tests ” and “ revisions ”. Strictly speaking, GT is more a research design type than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence students to watch a YouTube video about qualitative analysis. Using Grounded theory , you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to read a post about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop . As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature . In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up .

Grounded theory is used to create a new theory (or theories) by using the data at hand, as opposed to existing theories and frameworks.

QDA Method #6:   Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA . Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation . This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject-centred. In other words, it’s focused on the experiencer. This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias . While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.

IPA can help you understand the personal experiences of a person or group concerning a major life event, an experience or a situation.

How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “ How do I choose the right one? ”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions . In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people that have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims are distinctly different , and therefore different analysis methods would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant. 

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect . So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation ). Keep in mind though that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims , objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.

No single analysis method is perfect, so it can often make sense to adopt more than one  method (this is called triangulation).

Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis , a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis , which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we went south with grounded theory – which is about starting from scratch with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.




Types of Research – Explained with Examples


  • By DiscoverPhDs
  • October 2, 2020


Types of Research

Research is about using established methods to investigate a problem or question in detail with the aim of generating new knowledge about it.

It is a vital tool for scientific advancement because it allows researchers to prove or refute hypotheses based on clearly defined parameters, environments and assumptions. Due to this, it enables us to confidently contribute to knowledge as it allows research to be verified and replicated.

Knowing the types of research and what each of them focuses on will allow you to better plan your project, utilise the most appropriate methodologies and techniques, and better communicate your findings to other researchers and supervisors.

Classification of Types of Research

There are various types of research that are classified according to their objective, depth of study, analysed data, time required to study the phenomenon and other factors. It’s important to note that a research project will not be limited to one type of research, but will likely use several.

According to its Purpose

Theoretical Research

Theoretical research, also referred to as pure or basic research, focuses on generating knowledge , regardless of its practical application. Here, data collection is used to generate new general concepts for a better understanding of a particular field or to answer a theoretical research question.

Results of this kind are usually oriented towards the formulation of theories and are usually based on documentary analysis, the development of mathematical formulas and the reflection of high-level researchers.

Applied Research

Here, the goal is to find strategies that can be used to address a specific research problem. Applied research draws on theory to generate practical scientific knowledge, and its use is very common in STEM fields such as engineering, computer science and medicine.

This type of research is subdivided into two types:

  • Technological applied research : looks towards improving efficiency in a particular productive sector through the improvement of processes or machinery related to said productive processes.
  • Scientific applied research : has predictive purposes. Through this type of research design, we can measure certain variables to predict behaviours useful to the goods and services sector, such as consumption patterns and viability of commercial projects.


According to your Depth of Scope

Exploratory Research

Exploratory research is used for the preliminary investigation of a subject that is not yet well understood or sufficiently researched. It serves to establish a frame of reference and a hypothesis from which an in-depth study can be developed that will enable conclusive results to be generated.

Because exploratory research is based on the study of little-studied phenomena, it relies less on theory and more on the collection of data to identify patterns that explain these phenomena.

Descriptive Research

The primary objective of descriptive research is to define the characteristics of a particular phenomenon without necessarily investigating the causes that produce it.

In this type of research, the researcher must take particular care not to intervene in the observed object or phenomenon, as its behaviour may change if an external factor is involved.

Explanatory Research

Explanatory research is the most common type of research method and is responsible for establishing cause-and-effect relationships that allow generalisations to be extended to similar realities. It is closely related to descriptive research, although it provides additional information about the observed object and its interactions with the environment.

Correlational Research

The purpose of this type of scientific research is to identify the relationship between two or more variables. A correlational study aims to determine how much the other elements of the observed system change when one variable changes.
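
To make this concrete, here is a minimal Python sketch (with made-up data, not from this article) of how a correlational study might quantify the relationship between two variables using a Pearson correlation:

```python
# Minimal sketch: quantifying the relationship between two variables
# with a Pearson correlation coefficient (hypothetical data).
from scipy.stats import pearsonr

hours_studied = [2, 4, 5, 7, 8, 10]
exam_scores = [55, 60, 62, 70, 75, 83]

r, p_value = pearsonr(hours_studied, exam_scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
# r close to +1 or -1 indicates a strong linear relationship;
# correlation alone says nothing about which variable drives the other.
```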

According to the Type of Data Used

Qualitative Research

Qualitative methods are often used in the social sciences to collect, compare and interpret information. They have a linguistic-semiotic basis and are used in techniques such as discourse analysis, interviews, surveys, records and participant observation.

In order to use statistical methods to validate the results, the observations collected must be evaluated numerically. Qualitative research, however, tends to be subjective, since not all the data can be fully controlled. Therefore, this type of research design is better suited to extracting meaning from an event or phenomenon (the ‘why’) than to establishing its cause-and-effect mechanics (the ‘how’).

Quantitative Research

Quantitative research delves into a phenomenon through quantitative data collection, using mathematical, statistical and computer-aided tools to measure it. This allows generalised conclusions to be projected over time.


According to the Degree of Manipulation of Variables

Experimental Research

Experimental research involves designing or replicating a phenomenon whose variables are manipulated under strictly controlled conditions in order to identify or discover their effect on a dependent variable or object. The phenomenon to be studied is measured through study and control groups, according to the guidelines of the scientific method.

Non-Experimental Research

Also known as an observational study, it focuses on the analysis of a phenomenon in its natural context. As such, the researcher does not intervene directly, but limits their involvement to measuring the variables required for the study. Due to its observational nature, it is often used in descriptive research.

Quasi-Experimental Research

It controls only some variables of the phenomenon under investigation and is therefore not entirely experimental. In this case, the study and control groups cannot be randomly selected, but are chosen from existing groups or populations. This is to ensure the collected data is relevant and that the knowledge, perspectives and opinions of the population can be incorporated into the study.

According to the Type of Inference

Deductive Investigation

In this type of research, reality is explained by general laws that point to certain conclusions; the conclusions are expected to follow from the premises of the research problem and are considered correct if the premises are valid and the deductive method is applied correctly.

Inductive Research

In this type of research, knowledge is generated from an observation to achieve a generalisation. It is based on the collection of specific data to develop new theories.

Hypothetical-Deductive Investigation

It is based on observing reality to make a hypothesis, then use deduction to obtain a conclusion and finally verify or reject it through experience.


According to the Time in Which it is Carried Out

Longitudinal Study (also referred to as Diachronic Research)

It is the monitoring of the same event, individual or group over a defined period of time. It aims to track changes in a number of variables and see how they evolve over time. It is often used in medical, psychological and social areas.

Cross-Sectional Study (also referred to as Synchronous Research)

Cross-sectional research design is used to observe phenomena, an individual or a group of research subjects at a given time.

According to the Sources of Information

Primary Research

This fundamental research type is defined by the fact that the data is collected directly from the source, that is, it consists of primary, first-hand information.

Secondary Research

Unlike primary research, secondary research is developed with information from secondary sources, which are generally based on scientific literature and other documents compiled by another researcher.


According to How the Data is Obtained

Documentary (Cabinet)

Documentary research, based on secondary sources, consists of a systematic review of existing sources of information on a particular subject. This type of scientific research is commonly used when undertaking literature reviews or producing a case study.

From the Field

Field research involves the direct collection of information at the location where the observed phenomenon occurs.

From Laboratory

Laboratory research is carried out in a controlled environment in order to isolate a dependent variable and establish its relationship with other variables through scientific methods.

Mixed-Method: Documentary, Field and/or Laboratory

Mixed research methodologies combine results from both secondary (documentary) sources and primary sources through field or laboratory research.



8 Types of Analysis in Research

June 12, 2023 | By Hitesh Bhasin | Filed Under: Marketing

Data analysis is the detailed process of cleaning, transforming, analyzing, and presenting useful information with the goal of forming conclusions and supporting decision making. Data can be analyzed with multiple approaches across multiple domains, and it is essential for every business today to analyze the data it obtains from various sources.

Data analysis is useful in drawing conclusions about the variables present in the research. The approach to analysis, however, depends on the research being carried out. Without data analysis, it is difficult to determine the relationships between variables that lead to a meaningful conclusion. Thus, data analysis is an important tool for arriving at a particular conclusion.

Data can be analyzed in various ways. The following are a few methods by which data can be analyzed:


1) Exploratory Data Analysis (EDA)

It is one of the types of analysis in research that is used to analyze data and establish relationships that were previously unknown. It is specifically used to discover new connections and to define future studies or answer questions pertaining to them.

The answers provided by exploratory analysis are not definitive, but they give a preliminary sense of what is to come. Analyzing data sets with visual methods is the most commonly used approach in EDA. Exploratory data analysis was promoted by John Tukey, who defined data analysis as early as 1961.

Graphical techniques are used primarily in exploratory data analysis; the most common are the histogram, Pareto chart, stem-and-leaf plot, scatter plot, and box plot. The drawback of exploratory analysis is that it cannot be used to generalize or to predict upcoming events precisely: the data provides correlation, which does not imply causation. Exploratory data analysis can be applied to census data as well as convenience-sample data sets.

Software and machine-aided tools have become very common in EDA. A few of them are Data Applied, GGobi, JMP, KNIME, and Python.
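
For illustration only (the article does not include code), here is a minimal Python sketch of a typical EDA workflow using pandas and matplotlib; the file name and column names are hypothetical:

```python
# Minimal EDA sketch with pandas and matplotlib (hypothetical CSV and columns).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("survey_responses.csv")   # hypothetical data set

print(df.describe())          # summary statistics per numeric column
print(df.isna().sum())        # missing values per column

df.hist(figsize=(8, 6))       # histograms to inspect distributions
plt.tight_layout()
plt.show()

# Scatter plot to look for a possible (not causal) relationship
df.plot.scatter(x="age", y="monthly_spend")   # hypothetical columns
plt.show()
```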

2) Descriptive data analysis

This method requires the least amount of effort among all the methods of data analysis. It describes the main features of a collection of data quantitatively. This is usually the initial kind of analysis performed on an available data set, and it is often applied to large volumes of data such as census data. Descriptive data analysis involves separate steps for description and interpretation. There are two methods of statistical descriptive analysis: univariate and bivariate/multivariate. Both are types of analysis in research.

A) Univariate descriptive data analysis

The analysis which involves the distribution of a single variable is called univariate analysis.

B)  Bivariate and multivariate analysis

When the analysis involves describing the distribution of two or more variables, it is termed bivariate or multivariate analysis. Descriptive statistics, in such cases, may be used to describe the relationship between pairs of variables.
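
As a rough illustration (not from the article), a few lines of pandas cover both cases; the data below is made up:

```python
# Minimal sketch of univariate and bivariate descriptive statistics
# using pandas (hypothetical data).
import pandas as pd

df = pd.DataFrame({
    "age":    [23, 35, 31, 42, 28, 39],
    "income": [32000, 48000, 45000, 61000, 39000, 57000],
})

# Univariate: describe the distribution of a single variable
print(df["age"].mean(), df["age"].median(), df["age"].std())

# Bivariate: describe the relationship between a pair of variables
print(df["age"].corr(df["income"]))   # Pearson correlation by default
```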

3) Causal data analysis


Causal data analysis is also known as explanatory data analysis. It determines the cause-and-effect relationship between variables, and is primarily carried out to see what would happen to one variable if another variable were to change.

Causal studies usually require randomized experiments, but there are also approaches for inferring causation from non-randomized studies. Causal models are considered the gold standard among all other types of data analysis. The analysis is complex, and the researcher cannot be certain that all other variables influencing the causal relationship are held constant, especially when the research deals with the attitudes of customers in business.

Often, the researcher has to consider psychological effects that even the respondent may not be aware of; these unconsidered factors influence the data being analyzed and may affect the conclusions.

4) Predictive data analysis

As the name suggests, predictive data analysis employs methods that analyze current trends along with historical facts to make predictions about future trends and events.

The success of the prediction and the model depends on choosing and measuring the right variables. Predicting future trends is difficult and requires technical expertise in the subject. Machine learning is a modern tool that is increasingly used in predictive analysis to improve results. Prediction analysis is used to anticipate rising and changing trends in various industries.

Analytical customer relationship management, clinical decision support systems, collection analytics, fraud detection, and portfolio management are a few of the applications of predictive data analysis. Forecasting future financial trends is also a very important application.

A few of the software packages used for predictive analysis are Apache Mahout, GNU Octave, OpenNN, and MATLAB.
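
As a simple illustration of the idea (not a production forecasting setup), here is a Python sketch that fits a linear trend to hypothetical monthly sales and projects it forward with scikit-learn:

```python
# Minimal predictive sketch: fitting a trend to historical sales and
# projecting it forward with scikit-learn (hypothetical numbers).
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 13).reshape(-1, 1)              # past 12 months
sales = np.array([110, 115, 123, 130, 128, 140,
                  145, 150, 158, 161, 170, 176])      # historical sales

model = LinearRegression().fit(months, sales)

future_months = np.array([[13], [14], [15]])
print(model.predict(future_months))   # naive forecast of the next quarter
# Real predictive work would validate the model on held-out data and
# consider seasonality, external drivers, and model uncertainty.
```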

5) Inferential data analysis

Inferential data analysis is among the types of analysis in research that help test theories about a subject based on a sample taken from a group of subjects. A small part of a population is studied, and the conclusions are extrapolated to the larger population.

The goal of the statistical model is to provide an inference or conclusion about the population from a study of a small, representative sample. Since the process involves drawing conclusions or inferences, selecting a proper statistical model is very important.

The success of inferential data analysis depends on the statistical models used, as well as on the population and the sampling technique. It is crucial that a variety of representative subjects are studied in order to obtain better results.

Inferential analysis is applied to cross-sectional studies, retrospective data sets, and observational data. It can determine and predict excellent results only if a proper sampling technique is followed along with good tools for data analysis.
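
For illustration, here is a minimal Python sketch of an inferential step, a two-sample t-test on made-up sample data using SciPy:

```python
# Minimal inferential sketch: testing whether two samples drawn from a
# larger population differ in mean, using a two-sample t-test (made-up data).
from scipy import stats

group_a = [72, 75, 78, 71, 74, 77, 73]   # e.g., treatment sample
group_b = [70, 69, 74, 68, 71, 72, 70]   # e.g., control sample

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the difference observed in the samples is
# unlikely to be due to sampling error alone; the inference is only as
# good as the sampling technique behind the data.
```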

6) Decision trees

This is a modern classification algorithm from data mining and a very popular type of analysis in research that uses machine learning. It is usually represented as a tree-shaped diagram that maps out the decisions behind a classification or regression model.

The data may be repeatedly subdivided into smaller subsets that share similar values. The branches determine how the tree is built: where one goes with the current choice and where that choice leads next.

The primary advantage of a decision tree is that domain knowledge is not an essential requirement for the analysis. Also, classification with a decision tree is a simple and fast process that consumes less time than many other data analysis techniques.
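
Here is a minimal scikit-learn sketch (toy data, purely illustrative) showing a decision tree being trained and then used to classify a new case:

```python
# Minimal decision tree sketch with scikit-learn (toy data): the tree is
# learned from labeled examples and then used to classify new cases.
from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [age, monthly_visits]; label: 1 = purchased, 0 = did not
X = [[25, 2], [40, 8], [35, 5], [23, 1], [52, 9], [46, 7]]
y = [0, 1, 1, 0, 1, 1]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

print(export_text(tree, feature_names=["age", "monthly_visits"]))
print(tree.predict([[30, 6]]))   # classify a new, unseen case
```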

7) Mechanistic data analysis

In contrast to descriptive data analysis, which requires the least amount of effort, mechanistic data analysis requires the most. The primary idea behind mechanistic data analysis is to understand exactly how changes in some variables affect other variables.

Mechanistic analysis is exceptionally difficult except in simpler situations. It is used in the physical and engineering sciences, where systems can be described by deterministic sets of equations. Randomized trial data sets are a typical application of this type of analysis.

8) Evolutionary programming

It combines different types of analysis in research using evolutionary algorithms to form meaningful data and is a very common concept in data mining. Genetic algorithms and evolutionary algorithms are the most popular forms of evolutionary programming. They are largely domain-independent techniques, since they have the ability to search and explore large solution spaces to discover good solutions.
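
As a toy illustration of the idea (not tied to any particular library), the following plain-Python sketch evolves a small population toward the maximum of a simple fitness function:

```python
# Minimal genetic-algorithm sketch in plain Python: evolve a population of
# candidate solutions toward maximizing a simple fitness function.
import random

def fitness(x):
    return -(x - 7) ** 2 + 50          # toy objective, peak at x = 7

population = [random.uniform(-20, 20) for _ in range(20)]

for generation in range(50):
    # Selection: keep the fitter half of the population
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    # Crossover + mutation: children are averages of two parents plus noise
    children = [
        (random.choice(parents) + random.choice(parents)) / 2
        + random.gauss(0, 0.5)
        for _ in range(10)
    ]
    population = parents + children

print(round(max(population, key=fitness), 2))   # should approach 7
```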




Introduction to systematic review and meta-analysis

1 Department of Anesthesiology and Pain Medicine, Inje University Seoul Paik Hospital, Seoul, Korea

2 Department of Anesthesiology and Pain Medicine, Chung-Ang University College of Medicine, Seoul, Korea

Systematic reviews and meta-analyses present results by combining and analyzing data from different studies conducted on similar research topics. In recent years, systematic reviews and meta-analyses have been actively performed in various fields including anesthesiology. These research methods are powerful tools that can overcome the difficulties in performing large-scale randomized controlled trials. However, the inclusion of studies with any biases or improperly assessed quality of evidence in systematic reviews and meta-analyses could yield misleading results. Therefore, various guidelines have been suggested for conducting systematic reviews and meta-analyses to help standardize them and improve their quality. Nonetheless, accepting the conclusions of many studies without understanding the meta-analysis can be dangerous. Therefore, this article provides an easy introduction to clinicians on performing and understanding meta-analyses.

Introduction

A systematic review collects all possible studies related to a given topic and design, and reviews and analyzes their results [ 1 ]. During the systematic review process, the quality of studies is evaluated, and a statistical meta-analysis of the study results is conducted on the basis of their quality. A meta-analysis is a valid, objective, and scientific method of analyzing and combining different results. Usually, in order to obtain more reliable results, a meta-analysis is mainly conducted on randomized controlled trials (RCTs), which have a high level of evidence [ 2 ] ( Fig. 1 ). Since 1999, various papers have presented guidelines for reporting meta-analyses of RCTs. Following the Quality of Reporting of Meta-analyses (QUOROM) statement [ 3 ], and the appearance of registers such as Cochrane Library’s Methodology Register, a large number of systematic literature reviews have been registered. In 2009, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [ 4 ] was published, and it greatly helped standardize and improve the quality of systematic reviews and meta-analyses [ 5 ].

Fig. 1. Levels of evidence.

In anesthesiology, the importance of systematic reviews and meta-analyses has been highlighted, and they provide diagnostic and therapeutic value to various areas, including not only perioperative management but also intensive care and outpatient anesthesia [6–13]. Systematic reviews and meta-analyses include various topics, such as comparing various treatments of postoperative nausea and vomiting [ 14 , 15 ], comparing general anesthesia and regional anesthesia [ 16 – 18 ], comparing airway maintenance devices [ 8 , 19 ], comparing various methods of postoperative pain control (e.g., patient-controlled analgesia pumps, nerve block, or analgesics) [ 20 – 23 ], comparing the precision of various monitoring instruments [ 7 ], and meta-analysis of dose-response in various drugs [ 12 ].

Thus, literature reviews and meta-analyses are being conducted in diverse medical fields, and the aim of highlighting their importance is to help better extract accurate, good quality data from the flood of data being produced. However, a lack of understanding about systematic reviews and meta-analyses can lead to incorrect outcomes being derived from the review and analysis processes. If readers indiscriminately accept the results of the many meta-analyses that are published, incorrect data may be obtained. Therefore, in this review, we aim to describe the contents and methods used in systematic reviews and meta-analyses in a way that is easy to understand for future authors and readers of systematic review and meta-analysis.

Study Planning

It is easy to confuse systematic reviews and meta-analyses. A systematic review is an objective, reproducible method to find answers to a certain research question, by collecting all available studies related to that question and reviewing and analyzing their results. A meta-analysis differs from a systematic review in that it uses statistical methods on estimates from two or more different studies to form a pooled estimate [ 1 ]. Following a systematic review, if it is not possible to form a pooled estimate, it can be published as is without progressing to a meta-analysis; however, if it is possible to form a pooled estimate from the extracted data, a meta-analysis can be attempted. Systematic reviews and meta-analyses usually proceed according to the flowchart presented in Fig. 2 . We explain each of the stages below.

Fig. 2. Flowchart illustrating a systematic review.

Formulating research questions

A systematic review attempts to gather all available empirical research by using clearly defined, systematic methods to obtain answers to a specific question. A meta-analysis is the statistical process of analyzing and combining results from several similar studies. Here, the definition of the word “similar” is not made clear, but when selecting a topic for the meta-analysis, it is essential to ensure that the different studies present data that can be combined. If the studies contain data on the same topic that can be combined, a meta-analysis can even be performed using data from only two studies. However, study selection via a systematic review is a precondition for performing a meta-analysis, and it is important to clearly define the Population, Intervention, Comparison, Outcomes (PICO) parameters that are central to evidence-based research. In addition, selection of the research topic should be based on logical evidence, and it is important to select a topic that is familiar to readers but for which the evidence has not yet been clearly confirmed [ 24 ].

Protocols and registration

In systematic reviews, prior registration of a detailed research plan is very important. In order to make the research process transparent, primary/secondary outcomes and methods are set in advance, and in the event of changes to the method, other researchers and readers are informed when, how, and why. Many studies are registered with an organization like PROSPERO ( http://www.crd.york.ac.uk/PROSPERO/ ), and the registration number is recorded when reporting the study, in order to share the protocol at the time of planning.

Defining inclusion and exclusion criteria

Information is included on the study design, patient characteristics, publication status (published or unpublished), language used, and research period. If there is a discrepancy between the number of patients included in the study and the number of patients included in the analysis, this needs to be clearly explained while describing the patient characteristics, to avoid confusing the reader.

Literature search and study selection

In order to secure a proper basis for evidence-based research, it is essential to perform a broad search that includes as many studies as possible that meet the inclusion and exclusion criteria. Typically, the three bibliographic databases Medline, Embase, and Cochrane Central Register of Controlled Trials (CENTRAL) are used. In domestic studies, the Korean databases KoreaMed, KMBASE, and RISS4U may be included. Effort is required to identify not only published studies but also abstracts, ongoing studies, and studies awaiting publication. Among the studies retrieved in the search, the researchers remove duplicate studies, select studies that meet the inclusion/exclusion criteria based on the abstracts, and then make the final selection of studies based on their full text. In order to maintain transparency and objectivity throughout this process, study selection is conducted independently by at least two investigators. When there is an inconsistency in opinions, intervention is required via debate or by a third reviewer. The methods for this process also need to be planned in advance. It is essential to ensure the reproducibility of the literature selection process [ 25 ].

Quality of evidence

However well planned the systematic review or meta-analysis is, if the quality of evidence in the studies is low, the quality of the meta-analysis decreases and incorrect results can be obtained [ 26 ]. Even when using randomized studies with a high quality of evidence, evaluating the quality of evidence precisely helps determine the strength of recommendations in the meta-analysis. One method of evaluating the quality of evidence in non-randomized studies is the Newcastle-Ottawa Scale, provided by the Ottawa Hospital Research Institute 1) . However, we are mostly focusing on meta-analyses that use randomized studies.

If the Grading of Recommendations, Assessment, Development and Evaluations (GRADE) system ( http://www.gradeworkinggroup.org/ ) is used, the quality of evidence is evaluated on the basis of the study limitations, inaccuracies, incompleteness of outcome data, indirectness of evidence, and risk of publication bias, and this is used to determine the strength of recommendations [ 27 ]. As shown in Table 1 , the study limitations are evaluated using the “risk of bias” method proposed by Cochrane 2) . This method classifies bias in randomized studies as “low,” “high,” or “unclear” on the basis of the presence or absence of six processes (random sequence generation, allocation concealment, blinding participants or investigators, incomplete outcome data, selective reporting, and other biases) [ 28 ].

Table 1. The Cochrane Collaboration’s Tool for Assessing the Risk of Bias [ 28 ]

Data extraction

Two different investigators extract data based on the objectives and form of the study; thereafter, the extracted data are reviewed. Since the size and format of each variable are different, the size and format of the outcomes are also different, and slight changes may be required when combining the data [ 29 ]. If there are differences in the size and format of the outcome variables that cause difficulties combining the data, such as the use of different evaluation instruments or different evaluation timepoints, the analysis may be limited to a systematic review. The investigators resolve differences of opinion by debate, and if they fail to reach a consensus, a third reviewer is consulted.

Data Analysis

The aim of a meta-analysis is to derive a conclusion with greater power and accuracy than could be achieved in the individual studies. Therefore, before analysis, it is crucial to evaluate the direction of effect, size of effect, homogeneity of effects among studies, and strength of evidence [ 30 ]. Thereafter, the data are reviewed qualitatively and quantitatively. If it is determined that the different research outcomes cannot be combined, all the results and characteristics of the individual studies are displayed in a table or in a descriptive form; this is referred to as a qualitative review. A meta-analysis is a quantitative review, in which the clinical effectiveness is evaluated by calculating the weighted pooled estimate for the interventions in at least two separate studies.

The pooled estimate is the outcome of the meta-analysis, and is typically explained using a forest plot ( Figs. 3 and 4 ). The black squares in the forest plot are the odds ratios (ORs) and 95% confidence intervals in each study. The area of the squares represents the weight reflected in the meta-analysis. The black diamond represents the OR and 95% confidence interval calculated across all the included studies. The bold vertical line represents a lack of therapeutic effect (OR = 1); if the confidence interval includes OR = 1, it means no significant difference was found between the treatment and control groups.

Fig. 3. Forest plot analyzed by two different models using the same data. (A) Fixed-effect model. (B) Random-effect model. The figure depicts individual trials as filled squares with the relative sample size and the solid line as the 95% confidence interval of the difference. The diamond shape indicates the pooled estimate and uncertainty for the combined effect. The vertical line indicates the treatment group shows no effect (OR = 1). Moreover, if the confidence interval includes 1, then the result shows no evidence of difference between the treatment and control groups.

Fig. 4. Forest plot representing homogeneous data.

Dichotomous variables and continuous variables

In data analysis, outcome variables can be considered broadly in terms of dichotomous variables and continuous variables. When combining data from continuous variables, the mean difference (MD) and standardized mean difference (SMD) are used ( Table 2 ).

Table 2. Summary of Meta-analysis Methods Available in RevMan [ 28 ]

The MD is the absolute difference in mean values between the groups, and the SMD is the mean difference between groups divided by the standard deviation. When results are presented in the same units, the MD can be used, but when results are presented in different units, the SMD should be used. When the MD is used, the combined units must be shown. A value of “0” for the MD or SMD indicates that the effects of the new treatment method and the existing treatment method are the same. A value lower than “0” means the new treatment method is less effective than the existing method, and a value greater than “0” means the new treatment is more effective than the existing method.
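
As an illustration (not part of the original article), the MD and a simple SMD can be computed from per-group summary statistics as follows; the numbers are hypothetical, and meta-analysis software such as RevMan additionally applies a small-sample correction (Hedges' g) to the SMD:

```python
# Sketch (not the paper's code): computing the mean difference (MD) and a
# simple standardized mean difference for one study from summary statistics.
import math

# Hypothetical summary data for one study
n1, mean1, sd1 = 30, 4.2, 1.1   # intervention group
n2, mean2, sd2 = 30, 5.0, 1.3   # control group

md = mean1 - mean2

# Pooled standard deviation, then SMD (Cohen's d form)
sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
smd = md / sd_pooled

print(f"MD = {md:.2f}, SMD = {smd:.2f}")
```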

When combining data for dichotomous variables, the OR, risk ratio (RR), or risk difference (RD) can be used. The RR and RD can be used for RCTs, quasi-experimental studies, or cohort studies, and the OR can be used for other case-control studies or cross-sectional studies. However, because the OR is difficult to interpret, using the RR and RD, if possible, is recommended. If the outcome variable is a dichotomous variable, it can be presented as the number needed to treat (NNT), which is the minimum number of patients who need to be treated in the intervention group, compared to the control group, for a given event to occur in at least one patient. Based on Table 3 , in an RCT, if x is the probability of the event occurring in the control group and y is the probability of the event occurring in the intervention group, then x = c/(c + d), y = a/(a + b), and the absolute risk reduction (ARR) = x − y. NNT can be obtained as the reciprocal, 1/ARR.

Table 3. Calculation of the Number Needed to Treat in the Dichotomous Table
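
For illustration, the ARR and NNT calculation described above can be written out as a short Python sketch with hypothetical 2 x 2 counts:

```python
# Sketch of the ARR and NNT calculation described above, for a single RCT
# summarized as a 2x2 table (hypothetical counts).
a, b = 12, 88    # intervention group: events, non-events
c, d = 24, 76    # control group: events, non-events

y = a / (a + b)          # event probability, intervention group
x = c / (c + d)          # event probability, control group

arr = x - y              # absolute risk reduction
nnt = 1 / arr            # number needed to treat

print(f"ARR = {arr:.2f}, NNT = {nnt:.1f}")   # round NNT up in practice
```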

Fixed-effect models and random-effect models

In order to analyze effect size, two types of models can be used: a fixed-effect model or a random-effect model. A fixed-effect model assumes that the effect of treatment is the same, and that variation between results in different studies is due to random error. Thus, a fixed-effect model can be used when the studies are considered to have the same design and methodology, or when the variability in results within a study is small, and the variance is thought to be due to random error. Three common methods are used for weighted estimation in a fixed-effect model: 1) inverse variance-weighted estimation 3) , 2) Mantel-Haenszel estimation 4) , and 3) Peto estimation 5) .

A random-effect model assumes heterogeneity between the studies being combined, and these models are used when the studies are assumed different, even if a heterogeneity test does not show a significant result. Unlike a fixed-effect model, a random-effect model assumes that the size of the effect of treatment differs among studies. Thus, differences in variation among studies are thought to be due to not only random error but also between-study variability in results. Therefore, weight does not decrease greatly for studies with a small number of patients. Among methods for weighted estimation in a random-effect model, the DerSimonian and Laird method 6) is mostly used for dichotomous variables, as the simplest method, while inverse variance-weighted estimation is used for continuous variables, as with fixed-effect models. These four methods are all used in Review Manager software (The Cochrane Collaboration, UK), and are described in a study by Deeks et al. [ 31 ] ( Table 2 ). However, when the number of studies included in the analysis is less than 10, the Hartung-Knapp-Sidik-Jonkman method 7) can better reduce the risk of type 1 error than does the DerSimonian and Laird method [ 32 ].
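
As a simplified illustration of inverse variance-weighted pooling under a fixed-effect model, here is a short Python sketch with hypothetical effect estimates and standard errors; dedicated software such as RevMan implements these methods, including the random-effect variants, in full:

```python
# Sketch of inverse variance-weighted pooling (fixed-effect model) from
# per-study effect estimates and standard errors (hypothetical values).
effects = [0.30, 0.10, 0.45, 0.25]   # e.g., log odds ratios per study
ses     = [0.15, 0.20, 0.25, 0.10]   # standard errors per study

weights = [1 / se**2 for se in ses]              # inverse-variance weights
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"pooled effect = {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.3f} to {pooled + 1.96 * pooled_se:.3f})")
```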

Fig. 3 shows the results of analyzing outcome data using a fixed-effect model (A) and a random-effect model (B). As shown in Fig. 3 , while the results from large studies are weighted more heavily in the fixed-effect model, studies are given relatively similar weights irrespective of study size in the random-effect model. Although identical data were being analyzed, as shown in Fig. 3 , the significant result in the fixed-effect model was no longer significant in the random-effect model. One representative example of the small study effect in a random-effect model is the meta-analysis by Li et al. [ 33 ]. In a large-scale study, intravenous injection of magnesium was unrelated to acute myocardial infarction, but in the random-effect model, which included numerous small studies, the small study effect resulted in an association being found between intravenous injection of magnesium and myocardial infarction. This small study effect can be controlled for by using a sensitivity analysis, which is performed to examine the contribution of each of the included studies to the final meta-analysis result. In particular, when heterogeneity is suspected in the study methods or results, by changing certain data or analytical methods, this method makes it possible to verify whether the changes affect the robustness of the results, and to examine the causes of such effects [ 34 ].

Heterogeneity

A homogeneity test examines whether the variation in effect sizes calculated from several studies is greater than would be expected from sampling error alone; in other words, it tests whether the effect sizes calculated from the several studies can be considered the same. Three approaches can be used: 1) the forest plot, 2) Cochran’s Q test (chi-squared), and 3) Higgins’ I 2 statistic. In the forest plot, as shown in Fig. 4 , greater overlap between the confidence intervals indicates greater homogeneity. For the Q statistic, when the P value of the chi-squared test, calculated from the forest plot in Fig. 4 , is less than 0.1, it is considered to show statistical heterogeneity and a random-effect model can be used. Finally, I 2 can be used [ 35 ].

I 2 is calculated as I 2 = 100% × (Q − df)/Q, where Q is the chi-squared statistic and df is its degrees of freedom, and it returns a value between 0 and 100%. A value less than 25% is considered to show strong homogeneity, a value of 50% is average, and a value greater than 75% indicates strong heterogeneity.
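
For illustration, Cochran's Q and the I 2 statistic can be computed from per-study effects and standard errors as follows (hypothetical values):

```python
# Sketch of Cochran's Q and the I^2 statistic from study effects and
# standard errors (hypothetical values), following the definitions above.
effects = [0.30, 0.10, 0.45, 0.25]
ses     = [0.15, 0.20, 0.25, 0.10]

weights = [1 / se**2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
df = len(effects) - 1
i_squared = max(0.0, (q - df) / q) * 100   # clamp at 0 when Q < df

print(f"Q = {q:.2f} on {df} df, I^2 = {i_squared:.0f}%")
```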

Even when the data cannot be shown to be homogeneous, a fixed-effect model can be used, ignoring the heterogeneity, and all the study results can be presented individually, without combining them. However, in many cases, a random-effect model is applied, as described above, and a subgroup analysis or meta-regression analysis is performed to explain the heterogeneity. In a subgroup analysis, the data are divided into subgroups that are expected to be homogeneous, and these subgroups are analyzed. This needs to be planned in the predetermined protocol before starting the meta-analysis. A meta-regression analysis is similar to a normal regression analysis, except that the heterogeneity between studies is modeled. This process involves performing a regression analysis of the pooled estimate on covariates at the study level, and so it is usually not considered when the number of studies is less than 10. Here, univariate and multivariate regression analyses can both be considered.

Publication bias

Publication bias is the most common type of reporting bias in meta-analyses. This refers to the distortion of meta-analysis outcomes due to the higher likelihood of publication of statistically significant studies rather than non-significant studies. In order to test the presence or absence of publication bias, first, a funnel plot can be used ( Fig. 5 ). Studies are plotted on a scatter plot with effect size on the x-axis and precision or total sample size on the y-axis. If the points form an upside-down funnel shape, with a broad base that narrows towards the top of the plot, this indicates the absence of a publication bias ( Fig. 5A ) [ 29 , 36 ]. On the other hand, if the plot shows an asymmetric shape, with no points on one side of the graph, then publication bias can be suspected ( Fig. 5B ). Second, to test publication bias statistically, Begg and Mazumdar’s rank correlation test 8) [ 37 ] or Egger’s test 9) [ 29 ] can be used. If publication bias is detected, the trim-and-fill method 10) can be used to correct the bias [ 38 ]. Fig. 6 displays results that show publication bias in Egger’s test, which has then been corrected using the trim-and-fill method using Comprehensive Meta-Analysis software (Biostat, USA).

Fig. 5. Funnel plot showing the effect size on the x-axis and sample size on the y-axis as a scatter plot. (A) Funnel plot without publication bias. The individual plots are broader at the bottom and narrower at the top. (B) Funnel plot with publication bias. The individual plots are located asymmetrically.

Fig. 6. Funnel plot adjusted using the trim-and-fill method. White circles: comparisons included. Black circles: inputted comparisons using the trim-and-fill method. White diamond: pooled observed log risk ratio. Black diamond: pooled inputted log risk ratio.
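
As a rough illustration of the funnel plot described above (hypothetical per-study values, not the data behind Figs. 5 and 6), a basic version can be drawn with matplotlib:

```python
# Sketch of a funnel plot: effect size against standard error, with the
# y-axis inverted so that larger, more precise studies sit near the top
# (hypothetical per-study values).
import matplotlib.pyplot as plt

effects = [0.30, 0.10, 0.45, 0.25, 0.05, 0.50, 0.35]
ses     = [0.15, 0.20, 0.25, 0.10, 0.08, 0.30, 0.22]

plt.scatter(effects, ses)
plt.gca().invert_yaxis()          # most precise studies at the top
plt.xlabel("Effect size (e.g., log OR)")
plt.ylabel("Standard error")
plt.title("Funnel plot (symmetry suggests little publication bias)")
plt.show()
```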

Result Presentation

When reporting the results of a systematic review or meta-analysis, the analytical content and methods should be described in detail. First, a flowchart is displayed with the literature search and selection process according to the inclusion/exclusion criteria. Second, a table is shown with the characteristics of the included studies. A table should also be included with information related to the quality of evidence, such as GRADE ( Table 4 ). Third, the results of data analysis are shown in a forest plot and funnel plot. Fourth, if the results use dichotomous data, the NNT values can be reported, as described above.

Table 4. The GRADE Evidence Quality for Each Outcome

N: number of studies, ROB: risk of bias, PON: postoperative nausea, POV: postoperative vomiting, PONV: postoperative nausea and vomiting, CI: confidence interval, RR: risk ratio, AR: absolute risk.

When Review Manager software (The Cochrane Collaboration, UK) is used for the analysis, two types of P values are given. The first is the P value from the z-test, which tests the null hypothesis that the intervention has no effect. The second P value is from the chi-squared test, which tests the null hypothesis for a lack of heterogeneity. The statistical result for the intervention effect, which is generally considered the most important result in meta-analyses, is the z-test P value.

A common mistake when reporting results is, given a z-test P value greater than 0.05, to say there was “no statistical significance” or “no difference.” When evaluating statistical significance in a meta-analysis, a P value lower than 0.05 can be explained as “a significant difference in the effects of the two treatment methods.” However, the P value may appear non-significant whether or not there is a difference between the two treatment methods. In such a situation, it is better to announce “there was no strong evidence for an effect,” and to present the P value and confidence intervals. Another common mistake is to think that a smaller P value is indicative of a more significant effect. In meta-analyses of large-scale studies, the P value is more greatly affected by the number of studies and patients included, rather than by the significance of the results; therefore, care should be taken when interpreting the results of a meta-analysis.

When performing a systematic literature review or meta-analysis, if the quality of studies is not properly evaluated or if proper methodology is not strictly applied, the results can be biased and the outcomes can be incorrect. However, when systematic reviews and meta-analyses are properly implemented, they can yield powerful results that could usually only be achieved using large-scale RCTs, which are difficult to perform in individual studies. As our understanding of evidence-based medicine increases and its importance is better appreciated, the number of systematic reviews and meta-analyses will keep increasing. However, indiscriminate acceptance of the results of all these meta-analyses can be dangerous, and hence, we recommend that their results be received critically on the basis of a more accurate understanding.

1) http://www.ohri.ca .

2) http://methods.cochrane.org/bias/assessing-risk-bias-included-studies .

3) The inverse variance-weighted estimation method is useful if the number of studies is small with large sample sizes.

4) The Mantel-Haenszel estimation method is useful if the number of studies is large with small sample sizes.

5) The Peto estimation method is useful if the event rate is low or one of the two groups shows zero incidence.

6) The most popular and simplest statistical method used in Review Manager and Comprehensive Meta-analysis software.

7) Alternative random-effect model meta-analysis that has more adequate error rates than does the common DerSimonian and Laird method, especially when the number of studies is small. However, even with the Hartung-Knapp-Sidik-Jonkman method, when there are less than five studies with very unequal sizes, extra caution is needed.

8) The Begg and Mazumdar rank correlation test uses the correlation between the ranks of effect sizes and the ranks of their variances [ 37 ].

9) The degree of funnel plot asymmetry as measured by the intercept from the regression of standard normal deviates against precision [ 29 ].

10) If there are more small studies on one side, we expect the suppression of studies on the other side. Trimming yields the adjusted effect size and reduces the variance of the effects by adding the original studies back into the analysis as a mirror image of each study.


The 8 types of market research and how to use them

There are eight types of marketing research you can try to stay ahead of the competition. Learn more about marketing research methods and how to use them.


“If you keep doing what you’ve always done, you’ll keep getting what you’ve always got.”

Doesn’t sound too threatening if you’ve always been successful, right?

Continuing to do what you’ve always done means you’ll fall behind—and probably fade to darkness—to where all the forgotten brands go.

Take Kodak. They were a major player in photography for decades—remember? When digital photography boomed, Kodak kept doing what they always did. Their business floundered and people forgot about them. Well, everyone apart from Pitbull.

Now, look at Fujifilm, one of Kodak’s biggest competitors. They did the opposite and looked for ways to apply their expertise in film to the technology of the new millennium instead. Their company is still going strong.

The same goes for research. If you’re doing the same old types of market research, speaking to the same old people, and doing the same old tired surveys—you’re already behind.

How do you decide what kind of market research you need to do? It all comes down to what you need to know and what your business goals are.

In this article, we’ll explain the various types of market research you can use to solve issues and challenges in your business. We’ll throw you a freebie, too, and provide some market research tips about when to use each strategy.

Let’s get you ahead of the curve.

1. Brand research


Brand research helps with creating and managing a company’s brand, or identity. A company’s brand is the images, narratives, and characteristics people associate with it.

When to use it

Brand research can be used at every stage in a business’s lifecycle, from creation to new product launches and re-branding. There are at least seven types of brand research:

Brand advocacy: How many of your customers are willing to recommend your brand?

Brand awareness : Does your target market know who you are and consider you a serious option?

Brand loyalty: Are you retaining customers?

Brand penetration: What is the proportion of your target market using your brand?

Brand perception : What do people think of as your company’s identity or differentiating qualities?

Brand positioning: What is the best way to differentiate your brand from others in the consumer’s mind and articulate it in a way that resonates?

Brand value: How much are people willing to pay for an experience with your brand over another?

How to do it

A researcher will use several types of market research methods to assess your and your competitors’ strengths and weaknesses. Generally, they will conduct competitor research, both qualitative and quantitative, to get a picture of the overall marketplace. Focus groups and interviews can be used to learn about consumers’ emotions and associations with certain brands.

Market research surveys are useful for determining the features and benefits that differentiate you from competitors. These are then translated into emotionally compelling consumer language.

2. Campaign effectiveness

This type of market research is designed to evaluate whether your advertising messages are reaching the right people and delivering the desired results. Successful campaign effectiveness research can help you sell more and reduce customer acquisition costs.

It’s estimated people see up to 5,000 advertising messages each day. That means attention is a scarce resource, so campaign effectiveness research should be used when you need to spend your advertising dollars effectively.

Campaign effectiveness research depends on which stage of the campaign you use it in (ideally, it’s all of them!). Quantitative research can be conducted to provide a picture of how your target market views advertising and address weaknesses in the advertising campaign.

3. Competitive analysis


Competitive analysis allows you to assess your competitors’ strengths and weaknesses in the marketplace, providing you with fuel to drive a competitive advantage.

No business exists in a vacuum—competitive analysis is an integral part of any business and market plan. Whether you’re just getting started, moving into a new market, or doing a health check of your business, a competitive analysis will be invaluable.

A researcher will typically choose a few of your main competitors and analyze things like their marketing strategy, customer perceptions, revenue or sales volume, and so on.

Secondary sources such as articles, references, and advertising are excellent sources of competitive information; however, primary research, such as mystery shopping and focus groups, can offer valuable information on customer service and current consumer opinions.

4. Consumer insights

Consumer insights research does more than tell you about who your customers are and what they do. It reveals why customers behave in certain ways and helps you leverage that to meet your business goals.

Knowing your customers deeply is integral to creating a strategic marketing plan. This type of market research can help you anticipate consumer needs, spark innovation, personalize your marketing, solve business challenges, and more.

Consumer insights research should be specific to your business—it’s about getting to know your target audience and customers. Various market research methods can be used, such as interviews, ethnography, survey research, social monitoring, and customer journey research.

Here are some of the characteristics you should understand through consumer insights research:

Purchase habits

Interests, hobbies, passions

Personal and professional information

How they consume media and advertising

5. Customer satisfaction research

Customer satisfaction research is a type of market research that measures customers’ experiences with products or services, specifically looking at how those meet, exceed, or fail to live up to their expectations.

Customer satisfaction is a strong indicator of customer retention and overall business performance. Successful customer satisfaction research should help you understand what your customers like, dislike, and feel needs improvement. You can use this type of market research to look at the quality and design of products, speed and timeliness of delivery, staff and service reliability, knowledge and friendliness, market price, and value for money.

There are several ways to measure customer satisfaction, most commonly using surveys. An NPS or Voice of the Customer Survey can help you measure customer loyalty. Customer Effort Scoring measures how satisfied people are with customer service or problem resolution. CSAT is any survey that measures customer satisfaction , typically measured using Likert scale surveys . They can be conducted at different points in the customer experience, allowing deeper insight into that moment.
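
For example, here is a minimal sketch (hypothetical responses) of how an NPS question scored on a 0-10 scale is typically turned into a single score:

```python
# Minimal sketch of scoring an NPS question ("How likely are you to
# recommend us?", 0-10) from hypothetical survey responses.
responses = [10, 9, 8, 7, 9, 10, 6, 3, 9, 8, 10, 5]

promoters  = sum(1 for r in responses if r >= 9)
detractors = sum(1 for r in responses if r <= 6)

nps = (promoters - detractors) / len(responses) * 100
print(f"NPS = {nps:.0f}")   # ranges from -100 to +100
```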

6. Customer segmentation research


Customer segmentation studies aim to divide markets or customers into smaller groups or personas with similar characteristics to enable targeted marketing. By understanding how people in each category behave, you can understand how each influences revenue.

Customer segmentation research is best used if you’re ready to give customers individualized experiences. Not every customer in your target market is the same. The more you understand each specific persona, the easier it is to focus on delivering personalized marketing, build loyal relations, price products effectively, and forecast how new products and services will perform in each segment.

Market researchers use four characteristics to segment customers.

Demographics: age, gender, family status, education, household income, occupation, and so on

Geography: where people live, from cities and countries to whether they are city dwellers or suburbanites

Psychographics: socioeconomic status, class, lifestyle, personality traits, generation, interests, hobbies, etc.

Behavior: brand affinity, consumption and shopping habits, spending, etc.

A researcher will identify your current customers and collect data about them through various market research methods, such as surveys, database research, website analytics, interviews, and focus groups. The aim is to gather as much information as possible.
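Alongside surveys and interviews, segments are often derived quantitatively by clustering customers on the characteristics above. The sketch below is a hypothetical illustration in Python using k-means; the table, column names, and choice of three segments are assumptions for demonstration, not a prescribed workflow.

```python
# Hypothetical customer segmentation sketch using k-means clustering.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Made-up customer records combining demographic and behavioral features.
customers = pd.DataFrame({
    "age":              [23, 35, 52, 44, 29, 61, 38, 47],
    "annual_spend":     [250, 900, 1500, 1200, 400, 2000, 800, 1100],
    "visits_per_month": [1, 4, 6, 5, 2, 8, 3, 5],
})

# Standardize features so no single scale dominates the distance calculation.
scaled = StandardScaler().fit_transform(customers)

# Group customers into three personas; in practice the number of segments is
# chosen with domain knowledge or diagnostics such as the silhouette score.
customers["segment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)

print(customers.groupby("segment").mean())  # average profile of each segment
```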

7. Product development

Market research for product development involves using customer knowledge to inform the entire process of creating or improving a product, service, or app and bringing it to market.

Innovation is hard work. A quick Google will tell you that 80–95% of new products fail every year. Conducting market research for product and app development helps minimize the risk of a new product or change going bust as it enters the market. There are three stages where you can use market research:

Conception: The moment you’re thinking about adding something new, market research can find market opportunities and provide insights into customer challenges or their jobs-to-be-done, so you can find a way to fill the gap.

Formation: Once you have an idea, market researchers can help you turn it into a concept that can be tested. You can explore pricing strategy, test advertising and packaging, refine the value proposition, and so on.

Introduction: Market research can help you gauge attitudes toward the product once it’s in the market, adapt your messaging as it rolls out, keep improving the product, and find opportunities to introduce it to new markets.

Product development research will utilize different market research methods, depending on the goal of the research. A researcher could present focus groups with product concepts and listen to their opinions, conduct interviews to learn more about their pain points, or perform user testing to see how they interact with an app or website.

8. Usability testing

Usability testing is concerned with understanding how customers use your products in real time. It can involve physical products, like a new blender, or digital products like a website or app.

Usability testing is helpful when you need to detect problems or bugs in early prototypes or beta versions before launching them. It typically costs far less to test a product or service beforehand than to pull a flawed product off the shelves or lose sales because of poor functionality.

There are several types of usability tests, which vary based on whether you’re testing a physical or digital product.

Journey testing involves observing the customer experience on an app or website and monitoring how customers perform. This type of study can be done online.

Eye tracking studies monitor where people’s eyes are drawn. Generally, they are conducted on websites and apps, but can also be done in stores to analyze where people look while shopping

Learnability studies quantify the learning curve over time to see which problems people encounter when repeating the same task

Click tracking follows users’ activity on websites to evaluate the linking structure of a website

Checklist testing involves giving users a set of tasks to perform and recording their experience or asking them to review it

Combining types of market research with Typeform

When it comes to market research, you need to ask yourself what business challenge or question you’re trying to address. Then, select the appropriate methods and tools, such as market research automation, to simplify your process. From there, the world of useful data and actionable insights will open to you.

About the author

We're Typeform - a team on a mission to transform data collection by bringing you refreshingly different forms.

  • Open access
  • Published: 13 May 2024

Patient medication management, understanding and adherence during the transition from hospital to outpatient care - a qualitative longitudinal study in polymorbid patients with type 2 diabetes

  • Léa Solh Dost (ORCID: orcid.org/0000-0001-5767-1305) 1,2,
  • Giacomo Gastaldi (ORCID: orcid.org/0000-0001-6327-7451) 3 &
  • Marie P. Schneider (ORCID: orcid.org/0000-0002-7557-9278) 1,2

BMC Health Services Research, volume 24, Article number: 620 (2024)

Abstract

Background

Continuity of care is under great pressure during the transition from hospital to outpatient care. Medication changes during hospitalization may be poorly communicated and understood, compromising patient safety during the transition from hospital to home. The main aims of this study were to investigate the perspectives of patients with type 2 diabetes and multimorbidities on their medications from hospital discharge to outpatient care, and their healthcare journey through the outpatient healthcare system. In this article, we present the results focusing on patients’ perspectives of their medications from hospital to two months after discharge.

Methods

Patients with type 2 diabetes, with at least two comorbidities and who returned home after discharge, were recruited during their hospitalization. A descriptive qualitative longitudinal research approach was adopted, with four in-depth semi-structured interviews per participant over a period of two months after discharge. Interviews were based on semi-structured guides, transcribed verbatim, and analyzed thematically.

Results

Twenty-one participants were included from October 2020 to July 2021. Seventy-five interviews were conducted. Three main themes were identified: (A) Medication management, (B) Medication understanding, and (C) Medication adherence, during three periods: (1) Hospitalization, (2) Care transition, and (3) Outpatient care. Participants had varying levels of need for medication information and involvement in medication management during hospitalization and in outpatient care. The transition from hospital to autonomous medication management was difficult for most participants, who quickly returned to their routines, with some participants experiencing difficulties in medication adherence.

Conclusions

The transition from hospital to outpatient care is a challenging process during which discharged patients are vulnerable and are willing to take steps to better manage, understand, and adhere to their medications. The resulting tension between patients’ difficulties with their medications and lack of standardized healthcare support calls for interprofessional guidelines to better address patients’ needs, increase their safety, and standardize physicians’, pharmacists’, and nurses’ roles and responsibilities.

Introduction

Continuity of patient care is characterized as the collaborative engagement between the patient and their physician-led care team in the ongoing management of healthcare, with the mutual objective of delivering high-quality and cost-effective medical care [ 1 ]. Continuity of care is under great pressure during the transition of care from hospital to outpatient care, with a risk of compromising patients’ safety [ 2 , 3 ]. The early post-discharge period is a high-risk and fragile transition: once discharged, one in five patients experience at least one adverse event during the first three weeks following discharge, and more than half of these adverse events are drug-related [ 4 , 5 ]. A retrospective study examining all discharged patients showed that adverse drug events (ADEs) account for up to 20% of 30-day hospital emergency readmissions [ 6 ]. During hospitalization, patients’ medications are generally modified, with an average of nearly four medication changes per patient [ 7 ]. Information regarding medications such as medication changes, the expected effect, side effects, and instructions for use are frequently poorly communicated to patients during hospitalization and at discharge [ 8 , 9 , 10 , 11 ]. Between 20 and 60% of discharged patients lack knowledge of their medications [ 12 , 13 ]. Consideration of patients’ needs and their active engagement in decision-making during hospitalization regarding their medications are often lacking [ 11 , 14 , 15 ]. This can lead to unsafe discharge and contribute to medication adherence difficulties, such as non-implementation of newly prescribed medications [ 16 , 17 ].

Patients with multiple comorbidities and polypharmacy are at higher risk of ADEs [ 18 ]. Type 2 diabetes is one of the chronic health conditions most frequently associated with comorbidities, and patients with type 2 diabetes often lack continuity of care [ 19 , 20 , 21 ]. The prevalence of patients hospitalized with type 2 diabetes can exceed 40% [ 22 ], and these patients are at higher risk of readmission due to their comorbidities and their medications, such as insulin and oral hypoglycemic agents [ 23 , 24 , 25 ].

Interventions and strategies to improve patient care and safety at transition have shown mixed results worldwide in reducing cost, rehospitalization, ADEs, and non-adherence [ 26 , 27 , 28 , 29 , 30 , 31 , 32 , 33 , 34 , 35 ]. However, interventions that are patient-centered, include patient follow-up, and are led by interprofessional healthcare teams have shown promising results [ 34 , 35 , 36 ]. Most of these interventions have not been implemented routinely due to the extensive time to translate research into practice and the lack of hybrid implementation studies [ 37 , 38 , 39 , 40 , 41 ]. In addition, patient-reported outcomes and perspectives have rarely been considered, yet patients’ involvement is essential for seamless and integrated care [ 42 , 43 ]. Interprofessional collaboration, in which patients are full members of the team, is still in its infancy in outpatient care [ 44 ]. Barriers and facilitators regarding medications at the transition of care have been explored in multiple qualitative studies at one given time in a given setting (e.g., at discharge, one-month post-discharge) [ 8 , 45 , 46 , 47 , 48 ]. However, few studies have adopted a holistic methodology from the hospital to the outpatient setting to explore changes in patients’ perspectives over time [ 49 , 50 , 51 ]. Finally, little is known about whether, how, and when patients return to their daily routine following hospitalization and the impact of hospitalization weeks after discharge.

In Switzerland, continuity of care after hospital discharge is still poorly documented, both in terms of contextual analysis and interventional studies, and existing research has mainly been conducted in the hospital setting [ 31 , 35 , 52 , 53 , 54 , 55 , 56 ]. The first step of an implementation science approach is to perform a contextual analysis to set up effective interventions adapted to patients’ needs and aligned with healthcare professionals’ activities in a specific context [ 41 , 57 ]. Therefore, the main aims of this study were to investigate the perspectives of patients with type 2 diabetes and multimorbidities on their medications from hospital discharge to outpatient care, and on their healthcare journey through the outpatient healthcare system. In this article, we present the results focusing on patients’ perspectives of their medications from hospital to two months after discharge.

Methods

Study design

This qualitative longitudinal study, conducted from October 2020 to July 2021, used a qualitative descriptive methodology through four consecutive in-depth semi-structured interviews per participant at 3, 10, 30, and 60 days post-discharge, as illustrated in Fig. 1. Longitudinal qualitative research is characterized by qualitative data collection at different points in time and focuses on temporality, such as time and change [ 58 , 59 ]. Qualitative descriptive studies aim to explore and describe the depth and complexity of human experiences or phenomena [ 60 , 61 , 62 ]. We focused our qualitative study on the first 60 days after discharge, as this period is considered highly vulnerable and because studies often use 30- or 60-day readmission as an outcome measure [ 5 , 63 ].

This qualitative study follows the Consolidated Criteria for Reporting Qualitative Research (COREQ). Ethics committee approval was sought and granted by the Cantonal Research Ethics Commission, Geneva (CCER) (2020-01779).

Recruitment took place during participants’ hospitalization in the general internal medicine divisions at the Geneva University Hospitals in the canton of Geneva (500 000 inhabitants), Switzerland. Interviews took place at participants’ homes, in a private office at the University of Geneva, by telephone or by secure video call, according to participants’ preference. Informal caregivers could also participate alongside the participants.

Figure 1: Study flowchart

Researcher characteristics

All the researchers were trained in qualitative studies. The diabetologist and researcher (GG) who enrolled the patients in the study was involved directly or indirectly in most participants’ care during hospitalization (advice was sought from the Geneva University Hospital diabetes team, of which he was a member). LS (Ph.D. student and community pharmacist) was unknown to participants and presented herself during hospitalization as a “researcher” and not as a healthcare professional to avoid any risk of influencing participants’ answers. This study was not interventional, and the interviewer (LS) invited participants to contact a healthcare professional for any questions related to their medications or medical issues.

Population and sampling strategy

Patients with type 2 diabetes were chosen as an example population to describe patients with polypharmacy, as these patients usually have several health issues and polypharmacy [ 20 , 22 , 25 ]. Inclusion criteria for the study were: adult patients with type 2 diabetes, with at least two other comorbidities, hospitalized for at least three days in a general internal medicine ward, with a minimum of one medication change during the hospital stay, and who self-managed their medications once discharged home. Exclusion criteria were patients not reachable by telephone following discharge, unable to give consent (patients with schizophrenia, dementia, brain damage, or drug/alcohol misuse), or unable to communicate in French. A purposive sampling methodology was applied, aiming to include participants with different ages, genders, and types and numbers of health conditions by listing participants’ characteristics in a double-entry table, available in Supplementary Material 1, until thematic saturation was reached. Thematic saturation was considered achieved when no new code or theme emerged and new data repeated previously coded information [ 64 ]. Participants were identified if they were hospitalized in the ward dedicated to diabetes care or when the diabetes team was contacted for advice. The senior ward physician (GG) screened eligible patients and the interviewer (LS) obtained written consent before hospital discharge.

Data collection and instruments

Sociodemographic (age, gender, educational level, living arrangement) and clinical characteristics (reason for hospitalization, date of admission, health conditions, diabetes diagnosis, medications before and during hospitalization) were collected by interviewing participants before their discharge and by extracting participants’ data from electronic hospital files by GG and LS. Participants’ pharmacies were contacted with the participant’s consent to obtain medication records from the last three months if information regarding medications before hospitalization was missing in the hospital files.

Semi-structured interview guides for each interview (at 3, 10, 30, and 60 days post-discharge) were developed based on different theories and components of health behavior and medication adherence: the World Health Organization’s (WHO) five dimensions of adherence, the Information-Motivation-Behavioral skills model, and the Social Cognitive Theory [ 65 , 66 , 67 ]. Each interview explored participants’ itinerary in the healthcare system and their perspectives on their medications. Regarding medications, the following themes were addressed at each interview: changes in medications; patients’ understanding and involvement; information on their medications; self-management of their medications; and patients’ medication adherence. Other aspects were addressed in specific interviews: patients’ hospitalization and experience on their return home (interview 1), motivation (interviews 2 and 4), and patients’ feedback on the past two months (interview 4). Interview guides translated from French are available in Supplementary Material 2. The participants completed self-reported, self-administered questionnaires at selected interviews to obtain descriptive information on factors that may affect medication management and adherence and to determine trends in these determinants over time: quality of life (EQ-5D-5L) [ 68 ], literacy (Schooling-Opinion-Support questionnaire) [ 69 ], medication adherence (Adherence Visual Analogue Scale, A-VAS) [ 70 ], and the Belief in Medication Questionnaire (BMQ) [ 71 ]. The BMQ contains two subscores, Specific-Necessity and Specific-Concerns, addressing respectively participants’ perceived need for their medications and their concerns about adverse consequences associated with taking them [ 72 ].
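As an illustration of how such subscores are typically derived, the sketch below sums Likert-type item responses into Necessity and Concerns totals. The item names, the five-items-per-subscale structure, and the 1-5 response range are assumptions for demonstration only; the BMQ has its own licensed items and scoring rules.

```python
# Generic illustration of summing Likert items into two subscale scores.
from typing import Dict, List

def subscale_score(responses: Dict[str, int], items: List[str]) -> int:
    """Sum the responses (assumed 1-5) of the items belonging to one subscale."""
    return sum(responses[item] for item in items)

# Hypothetical responses from one participant (not study data).
responses = {"nec1": 4, "nec2": 5, "nec3": 4, "nec4": 3, "nec5": 5,
             "con1": 2, "con2": 1, "con3": 3, "con4": 2, "con5": 2}

necessity = subscale_score(responses, ["nec1", "nec2", "nec3", "nec4", "nec5"])  # 21
concerns = subscale_score(responses, ["con1", "con2", "con3", "con4", "con5"])   # 10

print(necessity, concerns)  # a high-necessity, low-concerns profile
```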

Data management

Informed consent forms, including consent to obtain health data, were securely stored in a private office at the University of Geneva. The participants’ identification key was protected by a password known only to MS and LS. Confidentiality was guaranteed by pseudonymization of participants’ information, and audio-recordings were destroyed once analyzed. Sociodemographic and clinical characteristics, medication changes, and answers to questionnaires were securely collected via electronic case report forms (eCRFs) in REDCap®. Interviews were double audio-recorded, and field notes were taken during interviews. Recorded interviews were manually transcribed verbatim in MAXQDA® (2018.2) by research assistants and LS, and transcripts were validated for accuracy by LS. A random sample of 20% of the questionnaires was checked for accuracy of transcription from the paper questionnaires to the eCRFs. Recorded sequences unrelated to the discussed topics were not transcribed, and this was noted in the transcripts.

Data analysis

A descriptive statistical analysis of sociodemographic characteristics, clinical characteristics, and self-reported questionnaire data was carried out. A thematic analysis of transcripts was performed, as described by Braun and Clarke [ 73 ]: raw data were read, text segments related to the study objectives were identified, text segments were identified to create new categories, similar or redundant categories were reduced, and a model that integrated all significant categories was created. The analysis was conducted in parallel with patient enrolment to ensure data saturation. To ensure the validity of the coding method, transcripts were double coded independently and discussed by the research team until similar themes were obtained. The research group developed and validated an analysis grid, with which LS systematically coded the transcripts, meeting regularly with the research team to discuss questions on data analysis and to ensure the quality of coding. The analysis was carried out in French, and the quotes of interest cited in the manuscript were translated and validated by a native English-speaking researcher to preserve their meaning.
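The study reports resolving coding differences through discussion rather than a statistic. For readers who prefer a quantitative check, the sketch below shows one common way to quantify agreement between two independent coders with Cohen's kappa; the theme labels are hypothetical and this is not the procedure the authors used.

```python
# Illustrative inter-coder agreement check with Cohen's kappa (not the authors' method).
from sklearn.metrics import cohen_kappa_score

# Hypothetical theme labels assigned by two coders to the same ten text segments.
coder_1 = ["management", "understanding", "adherence", "management", "adherence",
           "understanding", "management", "adherence", "understanding", "management"]
coder_2 = ["management", "understanding", "adherence", "understanding", "adherence",
           "understanding", "management", "adherence", "understanding", "management"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 means perfect agreement, 0 means chance-level
```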

In this analysis, we used the term “healthcare professionals” when more than one profession could be involved in participants’ medication management. Otherwise, when a specific healthcare professional was involved, we used the designated profession (e.g. physicians, pharmacists).

Patient and public involvement

During the development phase of the study, the interview guides and questionnaires were reviewed for clarity and validity and adapted by two patient partners with multiple health conditions who had previously experienced a hospital discharge. They are part of the HUG Patients Partners + 3P platform for research and patient and public involvement.

Results

Interviews and participants’ descriptions

A total of 75 interviews were conducted with 21 participants. In total, 31 patients were contacted: seven refused to participate (four at the project presentation and three at consent), two did not meet the selection criteria at discharge, and one was unreachable after discharge. Among the 21 participants, 15 participated in all interviews, four in three interviews, one in two interviews, and one in one interview, due to scheduling constraints. Details regarding the interviews and participants’ characteristics are presented in Tables 1 and 2.

The median length of time between hospital discharge and interviews 1, 2, 3, and 4 was 5 (IQR: 4-7), 14 (13-20), 35 (22-38), and 63 days (61-68), respectively. Comparing medications at hospital admission and discharge, a median of 7 medication changes (IQR: 6-9, range: 2-17) occurred per participant during hospitalization, and a median of 7 changes (5-12) occurred during the two months following discharge. Details regarding participants’ medications are described in Table 3.
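For clarity on the descriptive statistics reported above, the short sketch below computes a median and interquartile range from a vector of per-participant medication-change counts; the counts are invented for illustration and are not the study data.

```python
# Median and IQR of per-participant medication-change counts (illustrative data only).
import numpy as np

changes = np.array([2, 5, 6, 6, 7, 7, 8, 9, 9, 12, 17])

median = np.median(changes)
q1, q3 = np.percentile(changes, [25, 75])

print(f"median = {median:.0f}, IQR = {q1:.0f}-{q3:.0f}")  # median = 7, IQR = 6-9
```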

Participants’ self-reported adherence over the past week for their three most challenging medications is available in Supplementary Material 3.

Qualitative analysis

We defined care transition as the period from discharge until the first medical appointment post-discharge, and outpatient care as the period starting after the first medical appointment. Data was organized into three key themes (A. Medication management, B. Medication understanding, and C. Medication adherence) divided into subthemes at three time points (1. Hospitalization, 2. Care transition and 3. Outpatient care). Figure  2 summarizes and illustrates the themes and subthemes with their influencing factors as bullet points.

Figure 2: Participants’ medication management, understanding and adherence during hospitalization, care transition and outpatient care

A. Medication management

A.1 Medication management during hospitalization: medication management by hospital staff

Medications during hospitalization were mainly managed by hospital healthcare professionals (i.e. nurses and physicians) with varying degrees of patient involvement: “At the hospital, they prepared the medications for me. […] I didn’t even know what the packages looked like.” Participant 22; interview 1 (P22.1) Some participants reported having therapeutic education sessions with specialized nurses and physicians, such as the explanation and demonstration of insulin injection and glucose monitoring. A patient reported that he was given the choice of several treatments and was involved in shared decision-making. Other participants had an active role in managing and optimizing dosages, such as rapid insulin, due to prior knowledge and use of medications before hospitalization.

A.2 Medication management at transition: obtaining the medication and initiating self-management

Once discharged, some participants had difficulties obtaining their medications at the pharmacy because some medications were not stored and had to be ordered, delaying medication initiation. To counter this problem upstream, a few participants were provided a 24-to-48-hour supply of medications at discharge. It was sometimes requested by the patient or suggested by the healthcare professionals but was not systematic. The transition from medication management by hospital staff to self-management was exhausting for most participants who were faced with a large amount of new information and changes in their medications: “ When I was in the hospital, I didn’t even realize all the changes. When I came back home, I took away the old medication packages and got out the new ones. And then I thought : « my God, all this…I didn’t know I had all these changes » ” P2.1 Written documentation, such as the discharge prescription or dosage labels on medication packages, was helpful in managing their medication at home. Most participants used weekly pill organizers to manage their medications, which were either already used before hospitalization or were introduced post-discharge. The help of a family caregiver in managing and obtaining medications was reported as a facilitator.

A.3 Medication management in outpatient care: daily self-management and medication burden

A couple of days or weeks after discharge, most participants had acquired a routine so that medication management was less demanding, but the medication burden varied depending on the participants. For some, medication management became a simple action well implemented in their routine (“It has become automatic” , P23.4), while for others, the number of medications and the fact that the medications reminded them of the disease was a heavy burden to bear on a daily basis (“ During the first few days after getting out of the hospital, I thought I was going to do everything right. In the end, well [laughs] it’s complicated. I ended up not always taking the medication, not monitoring the blood sugar” P12.2) To support medication self-management, some participants had written documentation such as treatment plans, medication lists, and pictures of their medication packages on their phones. Some participants had difficulties obtaining medications weeks after discharge as discharge prescriptions were not renewable and participants did not see their physician in time. Others had to visit multiple physicians to have their prescriptions updated. A few participants were faced with prescription or dispensing errors, such as prescribing or dispensing the wrong dosage, which affected medication management and decreased trust in healthcare professionals. In most cases, according to participants, the pharmacy staff worked in an interprofessional collaboration with physicians to provide new and updated prescriptions.

B. Medication understanding

B.1 Medication understanding during hospitalization: new information and instructions

The amount of information received during hospitalization varied considerably among participants, with some reporting they had received too much and others saying they had received too little information regarding medication changes, the reasons for changes, or the reasons for introducing new medications: “They told me I had to take this medication all my life, but they didn’t tell me what the effects were or why I was taking it.” P5.3

Hospitalization was seen by some participants as a vulnerable and tiring period during which they were less receptive to information. Information and explanations were generally given verbally, making it complicated for most participants to recall it. Some participants reported that hospital staff was attentive to their needs for information and used communication techniques such as teach-back (a way of checking understanding by asking participants to say in their own words what they need to know or do about their health or medications). Some participants were willing to be proactive in the understanding of their medications while others were more passive, had no specific needs for information, and did not see how they could be engaged more.

B.2 Medication understanding at transition: facing medication changes

At hospital discharge, the most challenging difficulty for participants was to understand the changes made to their medications. For newly diagnosed participants, the addition of new medications was more difficult to understand, whereas for experienced participants, changes in known medications, such as dosage modifications, changes within a therapeutic class, and generic substitutions, were the most difficult to understand. Not having been informed about changes caused confusion and misunderstanding. Therefore, medication reconciliation done by the patient was time-consuming, especially for participants with multiple medications: “They didn’t tell me at all that they had changed my treatment completely. They just told me: « We’ve changed a few things ». But it was the whole treatment.” P2.3 Written information, such as the discharge prescription, the discharge report (a brief letter summarizing information about the hospitalization, given to the patient at discharge), or the label on the medication box (written by the pharmacist with instructions on dosage), helped them find or recall information about their medications and diagnoses. However, technical terms were used in hospital documents and were not always understandable. For example, one participant said: “On the prescription of valsartan, they wrote: ‘resume in the morning once profile…’ [once hypertension profile allows]… I don’t know what that means.” P8.1 In addition, some documents were incomplete, as mentioned by a patient whose insulin dosage was not indicated on the hospital prescription. Some participants sought help from healthcare professionals, such as pharmacists, hospital physicians, or general practitioners, a few days after discharge to review medications, answer questions, or obtain additional information.

B.3 Medication understanding in the outpatient care: concerns and knowledge

Weeks after discharge, most participants had concerns about the long-term use of their medications, their usefulness, and the possible risk of interactions or side effects. Some participants also reported a lack of knowledge regarding indications, names, or how the medication worked: “I don’t even know what Brilique® [ticagrelor, antiplatelet agent] is for. It’s for blood pressure, isn’t it? I don’t know.” P11.4 According to participants, the main reasons for the lack of understanding were the lack of information at the time of prescribing and the large number of medications, making it difficult to search for information and remember it. Participants sought information from different healthcare professionals or by themselves, on package inserts, through the internet, or from family and friends. Others reported having had all the information needed or were not interested in having more information. In addition, participants with low medication literacy, such as non-native speakers or elderly people, struggled more with medication understanding and sought help from family caregivers or healthcare professionals, even weeks after discharge: “I don’t understand French very well […] [The doctor] explained it very quickly… […] I didn’t understand everything he was saying” P16.2

C. Medication adherence

C.2 Medication adherence at transition: adopting new behaviors

Medication adherence was not mentioned as a concern during hospitalization, but a few participants reported difficulties in medication initiation once back home: “I have an injection of Lantus® [insulin] in the morning, but obviously, the first day [after discharge], I forgot to do it because I was not used to it.” P23.1 Participants had to quickly adopt new behaviors in the first few days after discharge, especially those with few medications pre-hospitalization. The use of weekly pill organizers, alarms, and specific storage spaces was reported as a facilitator of adherence. One patient did not initiate one of his medications because he did not understand its indication, and another patient took her old medications because she was used to them. Moreover, most participants experienced their hospitalization as a turning point, a time when they focused on their health, thought about the importance of their medications, and discussed any new lifestyle or dietary measures that might be implemented.

C.3 Medication adherence in outpatient care: ongoing medication adherence

More medication adherence difficulties appeared a few weeks after hospital discharge when most participants reported nonadherence behaviors, such as difficulties implementing the dosage regimen, or intentionally discontinuing the medication and modifying the medication regimen on their initiative. Determinants positively influencing medication adherence were the establishment of a routine; organizing medications in weekly pill-organizers; organizing pocket doses (medications for a short period that participants take with them when away from home); seeking support from family caregivers; using alarm clocks; and using specific storage places. Reasons for nonadherence were changes in daily routine; intake times that were not convenient for the patient; the large number of medications; and poor knowledge of the medication or side effects. Healthcare professionals’ assistance for medication management, such as the help of home nurses or pharmacists for the preparation of weekly pill-organizers, was requested by participants or offered by healthcare professionals to support medication adherence: “ I needed [a home nurse] to put my pills in the pillbox. […] I felt really weak […] and I was making mistakes. So, I’m very happy [the doctor] offered me [home care]. […] I have so many medications.” P22.3 Some participants who experienced prehospitalization non-adherence were more aware of their non-adherence and implemented strategies, such as modifying the timing of intake: “I said to my doctor : « I forget one time out of two […], can I take them in the morning? » We looked it up and yes, I can take it in the morning.” P11.2 In contrast, some participants were still struggling with adherence difficulties that they had before hospitalization. Motivations for taking medications two months after discharge were to improve health, avoid complications, reduce symptoms, reduce the number of medications in the future or out of obligation: “ I force myself to take them because I want to get to the end of my diabetes, I want to reduce the number of pills as much as possible.” P14.2 After a few weeks post-hospitalization, for some participants, health and illness were no longer the priority because of other life imperatives (e.g., family or financial situation).

Discussion

This longitudinal study provided a multi-faceted representation of how patients manage, understand, and adhere to their medications from hospital discharge to two months after discharge. Our findings highlighted the varying degrees of participants’ involvement in managing their medications during their hospitalization, the individualized needs for information during and after hospitalization, the complicated transition from hospital to autonomous medication management, the adaptation of daily routines around medication once back home, and the adherence difficulties that surfaced in outpatient care, with nonadherence prior to hospitalization being an indicator of behavior after discharge. Finally, our results confirmed the lack of continuity of care and showed the lack of standardization in patient care experienced by the participants during the transition from hospital to outpatient care.

This in-depth analysis of patients’ experiences reinforces common challenges identified in the existing literature, such as the lack of personalized information [ 9 , 10 , 11 ], loss of autonomy during hospitalization [ 14 , 74 , 75 ], difficulties in obtaining medication at discharge [ 11 , 45 , 76 ] and challenges in understanding treatment modifications and generic substitution [ 11 , 32 , 77 , 78 ]. Some of these studies were conducted during patients’ hospitalization [ 10 , 75 , 79 ] or up to 12 months after discharge [ 80 , 81 ], but most studies focused on the few days following hospital discharge [ 9 , 11 , 14 , 82 ]. Qualitative studies on medications at transition often focused on a specific topic, such as medication information, or a specific moment in time, and often included healthcare professionals, which muted patients’ voices [ 9 , 10 , 11 , 47 , 49 ]. Our qualitative longitudinal methodology aimed to capture the temporal dynamics, in-depth narratives, and contextual nuances of patients’ medication experiences during transitions of care [ 59 , 83 ]. This approach provided a comprehensive understanding of how patients’ perspectives and behaviors evolved over time, offering insights into the complex interactions of medication management, understanding and adherence, and turning points within their medication journeys. A qualitative longitudinal design was used by Fylan et al. to underline patients’ resilience in medication management during and after discharge, by Brandberg et al. to show the dynamic process of self-management during the 4 weeks post-discharge, and by Lawton et al. to examine how patients with type 2 diabetes perceived their care after discharge over a period of four years [ 49 , 50 , 51 ]. Our study focused on the first two months following hospitalization; future studies should focus on following discharged and at-risk patients over a longer period, as “transitions of care do not comprise linear trajectories of patients’ movements, with a starting and finishing point. Instead, they are endless loops of movements” [ 47 ].

Our results provide a particularly thorough description of how participants move from a state of total dependency during hospitalization regarding their medication management to a sudden and complete autonomy after hospital discharge impacting medication management, understanding, and adherence in the first days after discharge for some participants. Several qualitative studies have described the lack of shared decision-making and the loss of patient autonomy during hospitalization, which had an impact on self-management and created conflicts with healthcare professionals [ 75 , 81 , 84 ]. Our study also highlights nuanced patient experiences, including varying levels of patient needs, involvement, and proactivity during hospitalization and outpatient care, and our results contribute to capturing different perspectives that contrast with some literature that often portrays patients as more passive recipients of care [ 14 , 15 , 74 , 75 ]. Shared decision-making and proactive medication are key elements as they contribute to a smoother transition and better outcomes for patients post-discharge [ 85 , 86 , 87 ].

Consistent with the literature, the study identifies some challenges in medication initiation post-discharge [ 16 , 17 , 88 ] but our results also describe how daily routine rapidly takes over, either solidifying adherence behavior or generating barriers to medication adherence. Participants’ nonadherence prior to hospitalization was a factor influencing participants’ adherence post-hospitalization and this association should be further investigated, as literature showed that hospitalized patients have high scores of non-adherence [ 89 ]. Mortel et al. showed that more than 20% of discharged patients stopped their medications earlier than agreed with the physician and 25% adapted their medication intake [ 90 ]. Furthermore, patients who self-managed their medications had a lower perception of the necessity of their medication than patients who received help, which could negatively impact medication adherence [ 91 ]. Although participants in our study had high BMQ scores for necessity and lower scores for concerns, some participants expressed doubts about the need for their medications and a lack of motivation a few weeks after discharge. Targeted pharmacy interventions for newly prescribed medications have been shown to improve medication adherence, and hospital discharge is an opportune moment to implement this service [ 92 , 93 ].

Many medication changes were made during the transition of care (a median number of 7 changes during hospitalization and 7 changes during the two months after discharge), especially medication additions during hospitalization and interruptions after hospitalization. While medication changes during hospitalization are well described, the many changes following discharge are less discussed [ 7 , 94 ]. A Danish study showed that approximately 65% of changes made during hospitalization were accepted by primary healthcare professionals but only 43% of new medications initiated during hospitalization were continued after discharge [ 95 ]. The numerous changes after discharge may be caused by unnecessary intensification of medications during hospitalization, delayed discharge letters, lack of standardized procedures, miscommunication, patient self-management difficulties, or in response to an acute situation [ 96 , 97 , 98 ]. During the transition of care, in our study, both new and experienced participants were faced with difficulties in managing and understanding medication changes, either for newly prescribed medication or changes in previous medications. Such difficulties corroborate the findings of the literature [ 9 , 10 , 47 ] and our results showed that the lack of understanding during hospitalization led to participants having questions about their medications, even weeks after discharge. Particular attention should be given to patients’ understanding of medication changes jointly by physicians, nurses and pharmacists during the transition of care and in the months that follow as medications are likely to undergo as many changes as during hospitalization.

Implications for practice and future research

The patients’ perspectives in this study showed, at a system level, that there was a lack of standardization in healthcare professional practices regarding medication dispensing and follow-up. For now, in Switzerland, there are no official guidelines on medication prescription and dispensation during the transition of care although some international guidelines have been developed for outpatient healthcare professionals [ 3 , 99 , 100 , 101 , 102 ]. Here are some suggestions for improvement arising from our results. Patients should be included as partners and healthcare professionals should systematically assess (i) previous medication adherence, (ii) patients’ desired level of involvement and (iii) their needs for information during hospitalization. Hospital discharge processes should be routinely implemented to standardize hospital discharge preparation, medication prescribing, and dispensing. Discharge from the hospital should be planned with community pharmacies to ensure that all medications are available and, if necessary, doses of medications should be supplied by the hospital to bridge the gap. A partnership with outpatient healthcare professionals, such as general practitioners, community pharmacists, and homecare nurses, should be set up for effective asynchronous interprofessional collaboration to consolidate patients’ medication management, knowledge, and adherence, as well as to monitor signs of deterioration or adverse drug events.

Future research should consolidate our first attempt to develop a framework to better characterize medication at the transition of care, using Fig. 2 as a starting point. Contextualized interventions, co-designed by health professionals, patients, and stakeholders, should be evaluated in a hybrid implementation study to test both the implementation and the effectiveness of the intervention for the health system [ 103 ].

Limitations

This study has some limitations. First, the transcripts were validated for accuracy by the interviewer but not by a third party, which could have increased the robustness of the transcription. Nevertheless, the interviewer followed all methodological recommendations for transcription. Second, patient inclusion took place during the COVID-19 pandemic, which may have had an impact on patient care and the availability of healthcare professionals. Third, we cannot guarantee the accuracy of some participants’ medication history before hospitalization, even though we contacted the participants’ main pharmacy, as participants could have gone to different pharmacies to obtain their medications. Fourth, our findings may not be generalizable to other populations and other healthcare systems because some issues may be specific to multimorbid patients with type 2 diabetes or to the Swiss healthcare setting. Nevertheless, issues encountered by our participants regarding their medications correlate with findings in the literature. Fifth, only 15 out of 21 participants took part in all the interviews, but most participants took part in at least three interviews and data saturation was reached. Lastly, by its qualitative and longitudinal design, it is possible that the discussion during interviews and participants’ reflections between interviews influenced participants’ management, knowledge, and adherence, even though this study was observational, and no advice or recommendations were given by the interviewer during interviews.

Conclusions

Discharged patients are willing to take steps to better manage, understand, and adhere to their medications, yet they are also faced with difficulties in hospital and outpatient care. Furthermore, extensive changes in medications occur not only during hospitalization but also during the two months following hospital discharge, to which healthcare professionals should give particular attention. The different degrees of patients’ involvement, needs, and resources should be carefully considered to enable them to better manage, understand, and adhere to their medications. At a system level, patients’ experiences revealed a lack of standardization of medication practices during the transition of care. The healthcare system should provide the ecosystem needed for healthcare professionals responsible for or involved in the management of patients’ medications during the hospital stay, at discharge, and in outpatient care to standardize their practices while considering the patient as an active partner.

Data availability

The anonymized quantitative survey datasets and the qualitative codes are available in French from the corresponding author on reasonable request.

Abbreviations

ADE: Adverse drug events

A-VAS: Adherence Visual Analogue Scale

BMQ: Belief in Medication Questionnaire

COREQ: Consolidated Criteria for Reporting Qualitative Research

CRF: Case report form

SD: Standard deviation

WHO: World Health Organization

References

American Academy of Family Physicians. Continuity of Care, Definition of. 2020. Accessed 10 July 2022. https://www.aafp.org/about/policies/all/continuity-of-care-definition.html

Kripalani S, LeFevre F, Phillips CO, Williams MV, Basaviah P, Baker DW. Deficits in communication and information transfer between hospital-based and primary care physicians: implications for patient safety and continuity of care. JAMA. 2007;297(8):831–41.

World Health Organization (WHO). Medication Safety in Transitions of Care. 2019.

Forster AJ, Murff HJ, Peterson JF, Gandhi TK, Bates DW. The incidence and severity of adverse events affecting patients after discharge from the hospital. Ann Intern Med. 2003;138(3):161–7.

Krumholz HM. Post-hospital syndrome–an acquired, transient condition of generalized risk. N Engl J Med. 2013;368(2):100–2.

Banholzer S, Dunkelmann L, Haschke M, Derungs A, Exadaktylos A, Krähenbühl S, et al. Retrospective analysis of adverse drug reactions leading to short-term emergency hospital readmission. Swiss Med Wkly. 2021;151:w20400.

Blozik E, Signorell A, Reich O. How does hospitalization affect continuity of drug therapy: an exploratory study. Ther Clin Risk Manag. 2016;12:1277–83.

Allen J, Hutchinson AM, Brown R, Livingston PM. User experience and care for older people transitioning from hospital to home: patients’ and carers’ perspectives. Health Expect. 2018;21(2):518–27.

Daliri S, Bekker CL, Buurman BM, Scholte Op Reimer WJM, van den Bemt BJF, Karapinar-Çarkit F. Barriers and facilitators with medication use during the transition from hospital to home: a qualitative study among patients. BMC Health Serv Res. 2019;19(1):204.

Bekker CL, Mohsenian Naghani S, Natsch S, Wartenberg NS, van den Bemt BJF. Information needs and patient perceptions of the quality of medication information available in hospitals: a mixed method study. Int J Clin Pharm. 2020;42(6):1396–404.

Foulon V, Wuyts J, Desplenter F, Spinewine A, Lacour V, Paulus D, et al. Problems in continuity of medication management upon transition between primary and secondary care: patients’ and professionals’ experiences. Acta Clin Belgica: Int J Clin Lab Med. 2019;74(4):263–71.

Micheli P, Kossovsky MP, Gerstel E, Louis-Simonet M, Sigaud P, Perneger TV, et al. Patients’ knowledge of drug treatments after hospitalisation: the key role of information. Swiss Med Wkly. 2007;137(43–44):614–20.

Ziaeian B, Araujo KL, Van Ness PH, Horwitz LI. Medication reconciliation accuracy and patient understanding of intended medication changes on hospital discharge. J Gen Intern Med. 2012;27(11):1513–20.

Allen J, Hutchinson AM, Brown R, Livingston PM. User experience and care integration in Transitional Care for older people from hospital to home: a Meta-synthesis. Qual Health Res. 2016;27(1):24–36.

Mackridge AJ, Rodgers R, Lee D, Morecroft CW, Krska J. Cross-sectional survey of patients’ need for information and support with medicines after discharge from hospital. Int J Pharm Pract. 2018;26(5):433–41.

Mulhem E, Lick D, Varughese J, Barton E, Ripley T, Haveman J. Adherence to medications after hospital discharge in the elderly. Int J Family Med. 2013;2013:901845.

Fallis BA, Dhalla IA, Klemensberg J, Bell CM. Primary medication non-adherence after discharge from a general internal medicine service. PLoS ONE. 2013;8(5):e61735.

Zhou L, Rupa AP. Categorization and association analysis of risk factors for adverse drug events. Eur J Clin Pharmacol. 2018;74(4):389–404.

Moreau-Gruet F. La multimorbidité chez les personnes de 50 ans et plus. Résultats basés sur l’enquête SHARE (Survey of Health, Ageing and Retirement in Europe). Obsan Bulletin 4/2013. Neuchâtel: Observatoire suisse de la santé; 2013.

Iglay K, Hannachi H, Joseph Howie P, Xu J, Li X, Engel SS, et al. Prevalence and co-prevalence of comorbidities among patients with type 2 diabetes mellitus. Curr Med Res Opin. 2016;32(7):1243–52.

Sibounheuang P, Olson PS, Kittiboonyakun P. Patients’ and healthcare providers’ perspectives on diabetes management: a systematic review of qualitative studies. Res Social Adm Pharm. 2020;16(7):854–74.

Müller-Wieland D, Merkel M, Hamann A, Siegel E, Ottillinger B, Woker R, et al. Survey to estimate the prevalence of type 2 diabetes mellitus in hospital patients in Germany by systematic HbA1c measurement upon admission. Int J Clin Pract. 2018;72(12):e13273.

Blanc AL, Fumeaux T, Stirnemann J, Dupuis Lozeron E, Ourhamoune A, Desmeules J, et al. Development of a predictive score for potentially avoidable hospital readmissions for general internal medicine patients. PLoS ONE. 2019;14(7):e0219348.

Hansen LO, Greenwald JL, Budnitz T, Howell E, Halasyamani L, Maynard G, et al. Project BOOST: effectiveness of a multihospital effort to reduce rehospitalization. J Hosp Med. 2013;8(8):421–7.

Khalid JM, Raluy-Callado M, Curtis BH, Boye KS, Maguire A, Reaney M. Rates and risk of hospitalisation among patients with type 2 diabetes: retrospective cohort study using the UK General Practice Research Database linked to English Hospital Episode statistics. Int J Clin Pract. 2014;68(1):40–8.

Lussier ME, Evans HJ, Wright EA, Gionfriddo MR. The impact of community pharmacist involvement on transitions of care: a systematic review and meta-analysis. J Am Pharm Assoc. 2020;60(1):153–.

van der Heijden A, de Bruijne MC, Nijpels G, Hugtenburg JG. Cost-effectiveness of a clinical medication review in vulnerable older patients at hospital discharge, a randomized controlled trial. Int J Clin Pharm. 2019;41(4):963–71.

Bingham J, Campbell P, Schussel K, Taylor AM, Boesen K, Harrington A, et al. The Discharge Companion Program: an interprofessional collaboration in Transitional Care Model Delivery. Pharm (Basel). 2019;7(2):68.

Farris KB, Carter BL, Xu Y, Dawson JD, Shelsky C, Weetman DB, et al. Effect of a care transition intervention by pharmacists: an RCT. BMC Health Serv Res. 2014;14:406.

Meslot C, Gauchet A, Hagger MS, Chatzisarantis N, Lehmann A, Allenet B. A Randomised Controlled Trial to test the effectiveness of planning strategies to improve Medication Adherence in patients with Cardiovascular Disease. Appl Psychol Health Well Being. 2017;9(1):106–29.

Garnier A, Rouiller N, Gachoud D, Nachar C, Voirol P, Griesser AC, et al. Effectiveness of a transition plan at discharge of patients hospitalized with heart failure: a before-and-after study. ESC Heart Fail. 2018;5(4):657–67.

Daliri S, Bekker CL, Buurman BM, Scholte Op Reimer WJM, van den Bemt BJF, Karapinar-Çarkit F. Medication management during transitions from hospital to home: a focus group study with hospital and primary healthcare providers in the Netherlands. Int J Clin Pharm. 2020.

Hansen LO, Young RS, Hinami K, Leung A, Williams MV. Interventions to reduce 30-day rehospitalization: a systematic review. Ann Intern Med. 2011;155(8):520–8.

Leppin AL, Gionfriddo MR, Kessler M, Brito JP, Mair FS, Gallacher K, et al. Preventing 30-day hospital readmissions: a systematic review and meta-analysis of randomized trials. JAMA Intern Med. 2014;174(7):1095–107.

Donzé J, John G, Genné D, Mancinetti M, Gouveia A, Méan M et al. Effects of a Multimodal Transitional Care Intervention in patients at high risk of readmission: the TARGET-READ Randomized Clinical Trial. JAMA Intern Med. 2023.

Rodrigues CR, Harrington AR, Murdock N, Holmes JT, Borzadek EZ, Calabro K, et al. Effect of pharmacy-supported transition-of-care interventions on 30-Day readmissions: a systematic review and Meta-analysis. Ann Pharmacother. 2017;51(10):866–89.

Lam MYY, Dodds LJ, Corlett SA. Engaging patients to access the community pharmacy medicine review service after discharge from hospital: a cross-sectional study in England. Int J Clin Pharm. 2019;41(4):1110–7.

Hossain LN, Fernandez-Llimos F, Luckett T, Moullin JC, Durks D, Franco-Trigo L, et al. Qualitative meta-synthesis of barriers and facilitators that influence the implementation of community pharmacy services: perspectives of patients, nurses and general medical practitioners. BMJ Open. 2017;7(9):e015471.

En-Nasery-de Heer S, Uitvlugt EB, Bet PM, van den Bemt BJF, Alai A, van den Bemt P et al. Implementation of a pharmacist-led transitional pharmaceutical care programme: process evaluation of medication actions to reduce hospital admissions through a collaboration between Community and Hospital pharmacists (MARCH). J Clin Pharm Ther. 2022.

Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.

De Geest S, Zúñiga F, Brunkert T, Deschodt M, Zullig LL, Wyss K, et al. Powering Swiss health care for the future: implementation science to bridge the valley of death. Swiss Med Wkly. 2020;150:w20323.

Noonan VK, Lyddiatt A, Ware P, Jaglal SB, Riopelle RJ, Bingham CO 3, et al. Montreal Accord on patient-reported outcomes (PROs) use series - paper 3: patient-reported outcomes can facilitate shared decision-making and guide self-management. J Clin Epidemiol. 2017;89:125–35.

Hesselink G, Schoonhoven L, Barach P, Spijker A, Gademan P, Kalkman C, et al. Improving patient handovers from hospital to primary care: a systematic review. Ann Intern Med. 2012;157(6):417–28.

(OFSP) Interprofessionnalité dans le domaine de la santé Soins ambulatoire. Accessed 4 January 2024. https://www.bag.admin.ch/bag/fr/home/strategie-und-politik/nationale-gesundheitspolitik/foerderprogramme-der-fachkraefteinitiative-plus/foerderprogramme-interprofessionalitaet.html

Mitchell SE, Laurens V, Weigel GM, Hirschman KB, Scott AM, Nguyen HQ, et al. Care transitions from patient and caregiver perspectives. Ann Fam Med. 2018;16(3):225–31.

Davoody N, Koch S, Krakau I, Hägglund M. Post-discharge stroke patients’ information needs as input to proposing patient-centred eHealth services. BMC Med Inf Decis Mak. 2016;16:66.

Ozavci G, Bucknall T, Woodward-Kron R, Hughes C, Jorm C, Joseph K, et al. A systematic review of older patients’ experiences and perceptions of communication about managing medication across transitions of care. Res Social Adm Pharm. 2021;17(2):273–91.

Fylan B, Armitage G, Naylor D, Blenkinsopp A. A qualitative study of patient involvement in medicines management after hospital discharge: an under-recognised source of systems resilience. BMJ Qual Saf. 2018;27(7):539–46.

Fylan B, Marques I, Ismail H, Breen L, Gardner P, Armitage G, et al. Gaps, traps, bridges and props: a mixed-methods study of resilience in the medicines management system for patients with heart failure at hospital discharge. BMJ Open. 2019;9(2):e023440.

Brandberg C, Ekstedt M, Flink M. Self-management challenges following hospital discharge for patients with multimorbidity: a longitudinal qualitative study of a motivational interviewing intervention. BMJ Open. 2021;11(7):e046896.

Lawton J, Rankin D, Peel E, Douglas M. Patients’ perceptions and experiences of transitions in diabetes care: a longitudinal qualitative study. Health Expect. 2009;12(2):138–48.

Mabire C, Bachnick S, Ausserhofer D, Simon M. Patient readiness for hospital discharge and its relationship to discharge preparation and structural factors: a cross-sectional study. Int J Nurs Stud. 2019;90:13–20.

Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. Am J Community Psychol. 2012;50(3–4):462–80.

Meyer-Massetti C, Hofstetter V, Hedinger-Grogg B, Meier CR, Guglielmo BJ. Medication-related problems during transfer from hospital to home care: baseline data from Switzerland. Int J Clin Pharm. 2018;40(6):1614–20.

Neeman M, Dobrinas M, Maurer S, Tagan D, Sautebin A, Blanc AL, et al. Transition of care: a set of pharmaceutical interventions improves hospital discharge prescriptions from an internal medicine ward. Eur J Intern Med. 2017;38:30–7.

Geese F, Schmitt KU. Interprofessional Collaboration in Complex Patient Care Transition: a qualitative multi-perspective analysis. Healthc (Basel). 2023;11(3).

Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. Int J Nurs Stud. 2013;50(5):587–92.

Thomson R, Plumridge L, Holland J, Editorial. Int J Soc Res Methodol. 2003;6(3):185–7.

Audulv Å, Hall EOC, Kneck Å, Westergren T, Fegran L, Pedersen MK, et al. Qualitative longitudinal research in health research: a method study. BMC Med Res Methodol. 2022;22(1):255.

Kim H, Sefcik JS, Bradway C. Characteristics of qualitative descriptive studies: a systematic review. Res Nurs Health. 2017;40(1):23–42.

Sandelowski M. Whatever happened to qualitative description? Res Nurs Health. 2000;23(4):334–40.

Bradshaw C, Atkinson S, Doody O. Employing a qualitative description Approach in Health Care Research. Glob Qual Nurs Res. 2017;4:2333393617742282.

PubMed   PubMed Central   Google Scholar  

Bellone JM, Barner JC, Lopez DA. Postdischarge interventions by pharmacists and impact on hospital readmission rates. J Am Pharm Assoc (2003). 2012;52(3):358–62.

Hennink MM, Kaiser BN, Marconi VC. Code saturation versus meaning saturation: how many interviews are Enough? Qual Health Res. 2016;27(4):591–608.

World Health Organization. Adherence to long-term therapies: evidence for action. 2003.

Fisher JD, Fisher WA, Amico KR, Harman JJ. An information-motivation-behavioral skills model of adherence to antiretroviral therapy. Health Psychol. 2006;25(4):462–73.

Bandura A. Health promotion from the perspective of social cognitive theory. Psychol Health. 1998;13(4):623–49.

ShiftEUROQOL Research FOndation EQ 5D Instruments. Accessed 30 July 2022 https://euroqol.org/eq-5d-instruments/sample-demo/

Jeppesen KM, Coyle JD, Miser WF. Screening questions to predict limited health literacy: a cross-sectional study of patients with diabetes mellitus. Ann Fam Med. 2009;7(1):24–31.

Giordano TP, Guzman D, Clark R, Charlebois ED, Bangsberg DR. Measuring adherence to antiretroviral therapy in a diverse population using a visual analogue scale. HIV Clin Trials. 2004;5(2):74–9.

Horne R, Weinman J, Hankins M. The beliefs about medicines questionnaire: the development and evaluation of a new method for assessing the cognitive representation of medication. Psychol Health. 1999;14(1):1–24.

Horne R, Chapman SC, Parham R, Freemantle N, Forbes A, Cooper V. Understanding patients’ adherence-related beliefs about medicines prescribed for long-term conditions: a meta-analytic review of the necessity-concerns Framework. PLoS ONE. 2013;8(12):e80633.

Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Res Psychol. 2006;3(2):77–101.

Waibel S, Henao D, Aller M-B, Vargas I, Vázquez M-L. What do we know about patients’ perceptions of continuity of care? A meta-synthesis of qualitative studies. Int J Qual Health Care. 2011;24(1):39–48.

Rognan SE, Jørgensen MJ, Mathiesen L, Druedahl LC, Lie HB, Bengtsson K, et al. The way you talk, do I have a choice?’ Patient narratives of medication decision-making during hospitalization. Int J Qualitative Stud Health Well-being. 2023;18(1):2250084.

Michel B, Hemery M, Rybarczyk-Vigouret MC, Wehrle P, Beck M. Drug-dispensing problems community pharmacists face when patients are discharged from hospitals: a study about 537 prescriptions in Alsace. Int J Qual Health Care. 2016;28(6):779–84.

Bruhwiler LD, Hersberger KE, Lutters M. Hospital discharge: what are the problems, information needs and objectives of community pharmacists? A mixed method approach. Pharm Pract (Granada). 2017;15(3):1046.

Knight DA, Thompson D, Mathie E, Dickinson A. Seamless care? Just a list would have helped!’ Older people and their carer’s experiences of support with medication on discharge home from hospital. Health Expect. 2013;16(3):277–91.

Gualandi R, Masella C, Viglione D, Tartaglini D. Exploring the hospital patient journey: what does the patient experience? PLoS ONE. 2019;14(12):e0224899.

Norberg H, Håkansson Lindqvist M, Gustafsson M. Older individuals’ experiences of Medication Management and Care after Discharge from Hospital: an interview study. Patient Prefer Adherence. 2023;17:781–92.

Jones KC, Austad K, Silver S, Cordova-Ramos EG, Fantasia KL, Perez DC, et al. Patient perspectives of the hospital discharge process: a qualitative study. J Patient Exp. 2023;10:23743735231171564.

Hesselink G, Flink M, Olsson M, Barach P, Dudzik-Urbaniak E, Orrego C, et al. Are patients discharged with care? A qualitative study of perceptions and experiences of patients, family members and care providers. BMJ Qual Saf. 2012;21(Suppl 1):i39–49.

Murray SA, Kendall M, Carduff E, Worth A, Harris FM, Lloyd A, et al. Use of serial qualitative interviews to understand patients’ evolving experiences and needs. BMJ. 2009;339:b3702.

Berger ZD, Boss EF, Beach MC. Communication behaviors and patient autonomy in hospital care: a qualitative study. Patient Educ Couns. 2017;100(8):1473–81.

Davis RE, Jacklin R, Sevdalis N, Vincent CA. Patient involvement in patient safety: what factors influence patient participation and engagement? Health Expect. 2007;10(3):259–67.

Greene J, Hibbard JH. Why does patient activation matter? An examination of the relationships between patient activation and health-related outcomes. J Gen Intern Med. 2012;27(5):520–6.

Mitchell SE, Gardiner PM, Sadikova E, Martin JM, Jack BW, Hibbard JH, et al. Patient activation and 30-day post-discharge hospital utilization. J Gen Intern Med. 2014;29(2):349–55.

Weir DL, Motulsky A, Abrahamowicz M, Lee TC, Morgan S, Buckeridge DL, et al. Failure to follow medication changes made at hospital discharge is associated with adverse events in 30 days. Health Serv Res. 2020;55(4):512–23.

Kripalani S, Goggins K, Nwosu S, Schildcrout J, Mixon AS, McNaughton C, et al. Medication nonadherence before hospitalization for Acute Cardiac events. J Health Commun. 2015;20(Suppl 2):34–42.

Mortelmans L, De Baetselier E, Goossens E, Dilles T. What happens after Hospital Discharge? Deficiencies in Medication Management encountered by geriatric patients with polypharmacy. Int J Environ Res Public Health. 2021;18(13).

Mortelmans L, Goossens E, Dilles T. Beliefs about medication after hospital discharge in geriatric patients with polypharmacy. Geriatr Nurs. 2022;43:280–7.

Bandiera C, Ribaut J, Dima AL, Allemann SS, Molesworth K, Kalumiya K et al. Swiss Priority setting on implementing Medication Adherence interventions as Part of the European ENABLE COST action. Int J Public Health. 2022;67.

Elliott R, Boyd M, Nde S. at e. Supporting adherence for people starting a new medication for a long-term condition through community pharmacies: a pragmaticrandomised controlled trial of the New Medicine Service. 2015.

Grimmsmann T, Schwabe U, Himmel W. The influence of hospitalisation on drug prescription in primary care–a large-scale follow-up study. Eur J Clin Pharmacol. 2007;63(8):783–90.

Larsen MD, Rosholm JU, Hallas J. The influence of comprehensive geriatric assessment on drug therapy in elderly patients. Eur J Clin Pharmacol. 2014;70(2):233–9.

Viktil KK, Blix HS, Eek AK, Davies MN, Moger TA, Reikvam A. How are drug regimen changes during hospitalisation handled after discharge: a cohort study. BMJ Open. 2012;2(6):e001461.

Strehlau AG, Larsen MD, Søndergaard J, Almarsdóttir AB, Rosholm J-U. General practitioners’ continuation and acceptance of medication changes at sectorial transitions of geriatric patients - a qualitative interview study. BMC Fam Pract. 2018;19(1):168.

Anderson TS, Lee S, Jing B, Fung K, Ngo S, Silvestrini M, et al. Prevalence of diabetes medication intensifications in older adults discharged from US Veterans Health Administration Hospitals. JAMA Netw Open. 2020;3(3):e201511.

Royal Pharmaceutical Society. Keeping patients safewhen they transfer between care providers– getting the medicines right June 2012. Accessed 27 October 2023 https://www.rpharms.com/Portals/0/RPS%20document%20library/Open%20access/Publications/Keeping%20patients%20safe%20transfer%20of%20care%20report.pdf

International Pharmaceutical Federation (FIP). Medicines reconciliation: A toolkit for pharmacists. Accessed 23 September 2023 https://www.fip.org/file/4949

Californian Pharmacist Assiociation Transitions of Care Resource Guide. https://cdn.ymaws.com/www.cshp.org/resource/resmgr/Files/Practice-Policy/For_Pharmacists/transitions_of_care_final_10.pdf

Royal Collegue of Physicians. Medication safety at hospital discharge: Improvement guide and resource. Accessed 18 September 2023 https://www.rcplondon.ac.uk/file/33421/download

Douglas N, Campbell W, Hinckley J. Implementation science: buzzword or game changer. J Speech Lang Hear Res. 2015;58.

Download references

Acknowledgements

The authors would like to thank all the patients who took part in this study. We would also like to thank the Geneva University Hospitals Patients Partners + 3P platform as well as Mrs. Tourane Corbière and Mr. Joël Mermoud, patient partners, who reviewed interview guides for clarity and significance. We would like to thank Samuel Fabbi, Vitcoryavarman Koh, and Pierre Repiton for the transcriptions of the audio recordings.

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Open access funding provided by University of Geneva

Author information

Authors and affiliations.

School of Pharmaceutical Sciences, University of Geneva, Geneva, Switzerland

Léa Solh Dost & Marie P. Schneider

Institute of Pharmaceutical Sciences of Western Switzerland, University of Geneva, Geneva, Switzerland

Division of Endocrinology, Diabetes, Hypertension and Nutrition, Department of Medicine, Geneva University Hospitals, Geneva, Switzerland

Giacomo Gastaldi


Contributions

LS, GG, and MS conceptualized and designed the study. LS and GG screened and recruited participants. LS conducted the interviews. LS, GG, and MS performed data analysis and interpretation. LS drafted the manuscript and LS and MS worked on the different versions. MS and GG approved the final manuscript.

Corresponding authors

Correspondence to Léa Solh Dost or Marie P. Schneider .

Ethics declarations

Ethics approval and consent to participate.

Ethics approval was sought and granted by the Cantonal Research Ethics Commission, Geneva (CCER) (2020-01779), and informed consent to participate was obtained from all participants.

Consent for publication

Informed consent for publication was obtained from all participants.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Supplementary Material 3

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

Solh Dost, L., Gastaldi, G. & Schneider, M. Patient medication management, understanding and adherence during the transition from hospital to outpatient care - a qualitative longitudinal study in polymorbid patients with type 2 diabetes. BMC Health Serv Res 24 , 620 (2024). https://doi.org/10.1186/s12913-024-10784-9

Download citation

Received : 28 June 2023

Accepted : 26 February 2024

Published : 13 May 2024

DOI : https://doi.org/10.1186/s12913-024-10784-9


  • Continuity of care
  • Transition of care
  • Patient discharge
  • Medication management
  • Medication adherence
  • Qualitative research
  • Longitudinal studies
  • Patient-centered care
  • Interprofessional collaboration
  • Type 2 diabetes


  • Open access
  • Published: 14 May 2024

Transcriptomic analysis-guided assessment of precision-cut tumor slices (PCTS) as an ex-vivo tool in cancer research

  • Sumita Trivedi 1 ,
  • Caitlin Tilsed 2 ,
  • Maria Liousia 2 ,
  • Robert M. Brody 3 ,
  • Karthik Rajasekaran 3 ,
  • Sunil Singhal 4 ,
  • Steven M. Albelda 2 &
  • Astero Klampatsa 5  

Scientific Reports volume 14, Article number: 11006 (2024)


  • Biological models
  • Biological techniques
  • Gene expression analysis

With cancer immunotherapy and precision medicine dynamically evolving, there is greater need for pre-clinical models that can better replicate the intact tumor and its complex tumor microenvironment (TME). Precision-cut tumor slices (PCTS) have recently emerged as an ex vivo human tumor model, offering the opportunity to study individual patient responses to targeted therapies, including immunotherapies. However, little is known about the physiologic status of PCTS and how culture conditions alter gene expression. In this study, we generated PCTS from head and neck cancers (HNC) and mesothelioma tumors (Meso) and undertook transcriptomic analyses to understand the changes that occur in the timeframe between PCTS generation and up to 72 h (hrs) in culture. Our findings showed major changes occurring during the first 24 h culture period of PCTS, involving genes related to wound healing, extracellular matrix, hypoxia, and IFNγ-dependent pathways in both tumor types, as well as tumor-specific changes. Collectively, our data provides an insight into PCTS physiology, which should be taken into consideration when designing PCTS studies, especially in the context of immunology and immunotherapy.


Introduction.

Advancements in cancer therapy and precision medicine highlight the greater need for preclinical models that can recapitulate the complex interactions between tumor cells and the tumor microenvironment (TME) to identify optimal treatment strategies. Although cell culture, murine models, and patient-derived xenografts provide key information about tumor cell biology, they are unable to replicate the complex human TME and extracellular matrix in a single model 1 , 2 , 3 . Precision cut tumor slices (PCTS), which are generated by cutting thin, viable cross sections of fresh tumors, offer a unique approach to study human solid tumors within an architecturally intact microenvironment where spatial relationships are left largely intact 4 , 5 .

Although an appealing tool, PCTS utilization is subject to at least one major potential confounding issue that merits detailed consideration. The tumor tissue that is sliced into PCTS is subjected to several stressors, including cold ischemia during harvesting in surgery, the physical trauma of slicing into PCTS, loss of blood flow, potential hypoxia, and maintenance under conditions optimized for cell culture. Without a good understanding of the effects of these stressors on the physiology and gene expression of PCTS, the interpretation of any changes induced by experimental manipulations will be difficult.

To date, most evaluations of the physiologic status of PCTS have relied on histological appearance to assess the duration of time they are able to maintain a normal morphology 6 , 7 . This approach, however, is rather insensitive and limited by the small number of parameters that can be examined. Histologic approaches are also insufficient to look at multiple physiologic and pathologic pathways. A much more robust approach to obtain a more extensive evaluation is to use transcriptomics, where changes in many thousands of genes and many key pathways can be simultaneously evaluated over time. Surprisingly, there is a relative paucity of genomic data available, especially in tumors, and some of these data are contradictory. Bigaeva et al. provided a comprehensive characterization of the dynamic transcriptional changes in PCTS from normal and fibrotic mouse and human tissues using mRNA sequencing, comparing fresh PCTS to those cultured for 48 h 8 . They demonstrated that explantation and culture were associated with extensive transcriptional changes and, interestingly, impacted PCTS in a relatively similar way across organs in both species by triggering an inflammatory response and fibrosis-related extracellular matrix (ECM) remodeling. In marked contrast, Ghaderi et al. 4 performed mRNA sequencing of formalin-fixed paraffin embedded (FFPE) tissue from five pancreatic ductal cancer samples at baseline and their matched PCTS cultured at 24, 48, and 72 h. The number of differentially expressed genes that they reported was extremely small, ranging from zero to only 56 genes.

The goal of this study was to examine the gene expression changes occurring over time in PCTS generated from fresh human tumor samples. Our tumor data were consistent with those of Bigaeva et al. in normal tissues, finding extensive spontaneous changes at the transcriptomic level that tended to be most prominent in the first 24 h of culture with relative stabilization from 24 to 72 h 8 . These transcriptomic changes need to be considered when designing and analyzing PCTS experiments.

HNC and Meso PCTS maintain normal tumor architecture and demonstrate up to 72 h viability in culture

H&E stains of FFPE sections from PCTS of both HNCs and Mesos demonstrated morphological characteristics that are typical of these cancers (Fig.  1 A,B). Over the course of a 72 h culture, PCTS maintained structural integrity without visible necrosis, as shown in Fig.  1 C,D, respectively.

Figure 1. PCTS morphology and viability. H&E staining of fixed PCTS 3 μm sections showed that generation of PCTS using the Compresstome instrument retained intact tumor morphology in both HNC ( A , 4X) and Meso tumors ( B , 10X). PCTS stained every 24 h for 3 days showed that tumor architecture is preserved during the 72-h timeframe for both tumor types, with no signs of necrosis seen microscopically, suggesting the PCTS are viable ( C , 4X images; D , 10X images).

PCTS undergo major culture-induced transcriptomic changes in the first 24 h of culture

Transcriptomic analyses were conducted on PCTS from HNC (n = 3) and Meso (n = 3) patient tumor samples. All tumors had at least one freshly cut PCTS that was immediately fixed (“Fresh”) and one analyzed after 24 h in culture (“24 h”). In a subset of patients, data were also collected from PCTS at 48 and 72 h after culture.

Using unsupervised hierarchical clustering (Supplementary Fig.  1 A) and principal component analysis (Supplementary Fig.  1 B), we found that the Meso PCTS were tightly grouped. Two of the HNC PCTS were similar in phenotype, while one case (HNC 3) displayed a somewhat different gene expression profile. In all cases, the fresh and 24 h samples from each patient clustered together. We initially focused on the total number of transcriptomic changes that occurred over the first 24 h after culture (Fresh vs 24 h). Using the criteria described in the methods section, hundreds of genes showed changes (Supplementary Fig.  1 C), with more genes downregulated than upregulated. The 25 most increased and 25 most decreased genes are listed in Suppl. Table 1 .
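As a concrete illustration of this kind of exploratory step, a minimal Python sketch for clustering PCTS samples and running a PCA on a normalized log2 expression matrix is shown below. This is not the authors' pipeline: the file name, sample labels, and the choice of correlation distance are placeholder assumptions.

```python
# Minimal sketch (not the authors' pipeline): unsupervised hierarchical
# clustering and PCA of PCTS samples from a log2(CPM + 1) expression matrix.
# The file name and sample labels are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform
from sklearn.decomposition import PCA

# rows = genes, columns = samples (e.g. "Meso1_fresh", "Meso1_24h", ...)
expr = pd.read_csv("pcts_log2_cpm.csv", index_col=0)

# Hierarchical clustering of samples on 1 - Pearson correlation distance
sample_dist = 1 - expr.corr(method="pearson")               # samples x samples
tree = linkage(squareform(sample_dist.values, checks=False), method="average")
dendrogram(tree, labels=expr.columns.tolist())
plt.tight_layout()
plt.show()

# PCA with samples as observations and genes as features
pca = PCA(n_components=2)
coords = pca.fit_transform(expr.T.values)
for sample, (pc1, pc2) in zip(expr.columns, coords):
    print(f"{sample}: PC1 = {pc1:.1f}, PC2 = {pc2:.1f}")
```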

Pathway analysis of PCTS transcriptomic changes reveals upregulation of wound healing and ECM pathways and downregulation of TCR activation and IFN-gamma signaling

We next conducted a pathway analysis of the changed genes. We noted that some genes were changed in both tumor types, but that there were also genes that appeared to change in the HNC but not in the Meso tumors and vice versa. In the HNC PCTS, increases in cell-to-cell junctional molecules and TGF-β and VEGF-related signaling pathways were seen, with downregulation in some chemokine/cytokine pathways. In the Meso PCTS, increases in VEGF and PI3K pathways were observed (Fig.  2 ). Our primary analyses, performed on genes that were changed in both Meso and HNC, revealed several significantly changed GO biological (Fig.  3 A), GO molecular (Fig.  3 B), and Reactome pathways (Fig.  3 C). Of note was upregulation of pathways involved in extracellular matrix and fibrin reorganization and fibrinolysis, wound healing, angiogenesis, PI3K activation, and a subset of inflammatory and immune response genes. On the other hand, there was downregulation of pathways involved in TCR activation, IFNγ signaling, complement activation, HLA Class II expression, and a different subset of inflammatory and immune response genes.

Figure 2. Pathways that changed in either HNC or Meso. Pathway analysis (Reactome) using lists of ranked genes that were changed ( p  < 0.05 and fold change > 2) in only the HNC PCTS ( A ) or only the Meso PCTS ( B ).

Figure 3. Pathways that changed in both HNC and Meso. Significantly changed genes were analyzed using the GO Biologic ( A ), GO Molecular ( B ), and Reactome Pathways ( C ).

Gene comparisons-housekeeping and tumor genes

Given these pathway changes and our interest in using the PCTS to study immunologic questions, we compared the fold changes in mRNA expression levels of Fresh versus 24 h PCTS in several key categories in Fig.  4 . A set of housekeeping genes (Fig.  4 A, Suppl. Table 2 A and Suppl. Figure  2 A) were all expressed at high levels, albeit with a great deal of basal heterogeneity within and between tumor types. There was a slight trend toward downregulation of these genes at 24 h, but no statistically significant differences, suggesting no widespread changes in cell viability. A set of tumor selective genes (Fig.  4 B, Suppl. Table 2 B and Supplementary Fig.  2 B) were expressed heterogeneously. As would be expected, expression levels of epithelial-related mRNAs, like EGFR and E-cadherin, were much higher in HNCs. There were few changes in these genes. In contrast, most of the tumor-related genes in the Meso PCTS were downregulated at 24 h, suggesting some loss of tumor cells.

Figure 4. Summary of gene expression changes. The log2 fold changes (X-axes) in mRNA expression levels at 24 h in a number of specific genes in key categories including ( A ) housekeeping genes, ( B ) tumor selective genes, ( C ) T cell genes, ( D ) macrophage genes, ( E ) neutrophil genes, ( F ) endothelial genes, ( G ) fibroblast genes, ( H ) myeloid-attracting chemokine genes, ( I ) T cell/NK cell-attracting chemokine genes, ( J ) cytokine genes, ( K ) collagen genes, ( L ) extracellular matrix protein genes, ( M ) HLA Class 1 and antigen presentation genes, ( N ) HLA Class 2 genes, ( O ) interferon gamma-induced genes, ( P ) TGFβ-induced genes, ( Q ) wound healing signature genes, ( R ) hypoxia-induced genes, ( S ) EMT-mesenchymal genes, ( T ) EMT-epithelial genes, ( U ) proliferation/cell cycle genes. Red = downregulated at 24 h, green = upregulated at 24 h. p values calculated by Wilcoxon test. p values for significant or borderline significant changes are noted on the graphs.
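For readers who want to reproduce this kind of per-gene comparison, the sketch below shows one way to compute paired log2 fold changes and Wilcoxon signed-rank p values across the six tumors for a small gene set. The gene list, file name, and column names are illustrative assumptions, not the study's exact code.

```python
# Sketch only: per-gene Wilcoxon signed-rank test on paired Fresh vs 24 h
# expression values, one way to obtain per-gene p values like those in Fig. 4.
import numpy as np
import pandas as pd
from scipy.stats import wilcoxon

expr = pd.read_csv("pcts_log2_cpm.csv", index_col=0)          # hypothetical file
fresh_cols = ["HNC1_fresh", "HNC2_fresh", "HNC3_fresh",
              "Meso1_fresh", "Meso2_fresh", "Meso3_fresh"]
h24_cols = [c.replace("fresh", "24h") for c in fresh_cols]

hypoxia_genes = ["HIF1A", "VEGFA", "VEGFC", "PLAUR", "MMP3"]   # example set
for gene in hypoxia_genes:
    fresh = expr.loc[gene, fresh_cols].astype(float).values
    h24 = expr.loc[gene, h24_cols].astype(float).values
    log2_fc = np.median(h24 - fresh)       # values are already on a log2 scale
    stat, p = wilcoxon(fresh, h24)         # paired test across the 6 tumors
    print(f"{gene}: median log2FC = {log2_fc:+.2f}, p = {p:.3f}")
```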

Gene comparisons- cell type selective genes

We next evaluated mRNAs that are characteristically expressed in specific cell types. The T cell selective genes were expressed with basal heterogeneity (Fig.  4 C, Suppl. Table 2 C and Suppl. Figure  2 C). One of the Meso tumors was much “hotter” than the other PCTS, in which the expression levels were rather low (Supplementary Fig.  2 C). There was a trend towards a decrease in T cell genes (8 of 9 mRNAs were decreased), but most of the changes were not significant. The degree of downregulation of T cell genes in the HNCs in the 24 h PCTS was greater than that seen in the Meso PCTS. Macrophage selective genes (Fig.  4 D, Supplementary Table 2 D, Supplementary Fig.  2 D) showed a great deal of basal heterogeneity, but the expression levels were higher in Meso PCTS. The general macrophage marker (CD68) was significantly increased at 24 h, but CD14 and ITGAM were not. However, the M2 marker genes (CSFR1, CD163, MS4A4A, and MRC1) were all decreased, suggesting a shift to a more M1-like phenotype. No significant changes were noted in neutrophil genes (Fig.  4 E, Supplementary Table 2 E and Supplementary Fig.  2 E). There was heterogeneity in endothelial selective genes (Fig.  4 F, Supplementary Table 2 F and Supplementary Fig.  2 F). Most endothelial cell genes were downregulated in the 24 h PCTS, with the changes being significant or near significant. As expected, basal expression of fibroblast selective genes was higher in the Meso compared to the HNC PCTS (Fig.  4 G, Supplementary Table 2 G and Supplementary Fig.  2 G). Most of the fibroblast genes were downregulated, with the changes in THY1, MMP2, and ACTA2 being significant.

Gene comparisons-chemokine and cytokine genes

Levels of chemokines known to attract myeloid cells were first examined (Fig.  4 H, Supplementary Table 3 A, Supplementary Fig.  3 A). Basal levels of CCL2 were much higher in the Meso vs HNC PCTS. Compared to baseline, expression of all myeloid chemokine mRNAs was higher in the 24 h PCTS, representing some of the largest increases we observed in any gene set. The fold increases were higher in the Meso compared to HNC PCTS. In contrast, expression of the lymphoid-attracting chemokine mRNAs examined was much lower in the 24 h PCTS (Fig.  4 I, Supplementary Table 3 B and Supplementary Fig.  3 B). We observed statistically significant and large decreases in expression in the CXCL9, CXCL10, and CCL5 genes. The decreases in expression were greater in the HNC compared with Meso PCTS. There was some heterogeneity in the basal levels, but in general, these chemokines were higher in the Meso tumors. Most cytokine mRNAs examined showed relatively few changes (Fig.  4 J, Supplementary Table 3 C and Supplementary Fig.  3 C). However, IL-6 and IL-11 showed large and significant increases at 24 h. IL-1A, IL-1B, and IL-33 showed increases that did not reach statistical significance. TNFα and IFNγ mRNAs were present at only low levels.

Gene comparisons-extracellular matrix (ECM) genes

Given the results of our pathway analysis above showing changes in ECM gene expression, we examined the expression levels of collagen mRNAs and other ECM mRNAs (Fig.  4 K, Supplementary Table 3 D, and Supplementary Fig.  3 D). There was tumor-specific basal heterogeneity, but we observed a consistent down-regulation of Collagen 1, 3, and 4 mRNAs. Regarding other ECM genes, there was some tumor-specific basal heterogeneity (Fig.  4 L, Supplementary Table 3 E, and Supplementary Fig.  3 E), but most were decreased at 24 h, with more marked decreases in the Meso vs HNC PCTS.

Gene comparisons- HLA class 1, class 2, and antigen presentation machinery (APM)

HLA-Class 1 genes, B2M, and APM genes were expressed at very different levels among the PCTS (Fig.  4 M, Supplementary Table 4 A, Supplementary Fig.  4 A). There was a slight trend toward downregulation of the HLA Class I genes and antigen presenting machinery mRNAs in HNC compared with Meso, but no significant changes were seen. In contrast, there was a very consistent and strong downregulation in the mRNAs of HLA Class II genes and Class II-regulating transcription factors, with more downregulation in HNC vs Meso PCTS (Fig.  4 N, Supplementary Table 4 B, Supplementary Fig.  4 B).

Gene comparisons-IFNγ signature

A large amount of heterogeneity in the basal levels of mRNAs of genes induced by IFNγ was seen (Fig.  4 O, Supplementary Table 4 C, Supplementary Fig.  4 C), but most genes were downregulated (4 significantly), except for IDO. The decreases in expression were greater in the HNC vs Meso PCTS.

Gene comparisons-TGF-β-induced genes

No clear pattern was evident in a panel of well-described TGF-β-induced genes. Aside from TGF-β1 itself, the mRNAs of all other TGF-β-induced genes were higher in Meso PCTS. IL-11, SERPINE1, and INHBA mRNAs were significantly increased; however, most other genes decreased or were unchanged (Fig.  4 P, Suppl. Table 4 D, Supplementary Fig.  4 D).

Gene comparisons-wound healing genes

Genes from a wound healing signature described by Vitali et al. were examined (Fig.  4 Q, Suppl. Table 5 B, Supplementary Fig.  5 B) 9 . There was tumor-specific basal heterogeneity, but there was a trend towards an increase in the wound signature mRNAs, especially in the Meso PCTS; all genes except COL3A1 were increased, but only FGF5 reached significance.

Gene comparisons-hypoxia-induced genes

All the hypoxia genes examined showed large and mostly significant increases at 24 h (Fig.  4 R, Suppl. Table 5 A, Suppl. Fig.  5 A), apart from NFE2L2 (NRF2). The hypoxic response was generally greater in the Meso PCTS.

Gene comparisons-epithelial to mesenchymal transition (EMT) genes

Sets of genes that define a mesenchymal phenotype (Fig.  4 S, Suppl. Table 5 C, Supplementary Fig.  5 C) or epithelial phenotype (Fig.  4 T, Suppl. Table 5 D, Supplementary Fig.  5 D) were compared. As expected, the mesenchymal genes were higher at baseline in Meso versus HNC tumors (since Meso is a more mesenchymal tumor), but there was no clear trend toward increased mesenchymal gene expression. In contrast, basal levels of many epithelial genes were much higher in HNC PCTS than Meso PCTS. However, except for CDH1, there was no significant downregulation of epithelial genes.

Gene comparisons-proliferation-related genes and cell cycle inhibitor genes

There were no clear changes in the proliferation-related mRNAs (Fig.  4 U, Suppl. Table 6 A, Supplementary Fig.  5 E), with a slight trend toward decreased proliferation gene expression seen in the HNC PCTS. Changes in cell cycle inhibitor mRNAs were mixed: p21 and p15 increased, but p53 and p27 decreased (Fig.  4 U, Suppl. Table 6 B, Supplementary Fig.  5 E).

Gene expression changes over time mostly occur in the first 24 h post-slicing

In general, the largest changes in gene expression were seen between the fresh and the 24 h PCTS. Values then tended to plateau, with the most stable period between 24 and 48 h. Examples of this pattern for three downregulated genes are shown in Fig.  5 : E-cadherin (CDH1) (Fig.  5 A), KDR (Fig.  5 B), and COL4A2 (Fig.  5 C). Examples of this pattern for three upregulated genes are IL-11 (Fig.  5 D), CXCL3 (Fig.  5 E), and HIF1A (Fig.  5 F). Supplemental Fig.  6 depicts the mRNA expression changes over time of representative housekeeping (6A), tumor (6B), T cell-selective (6C), macrophage-selective (6D), endothelial-selective (6E), fibroblast-selective (6F), cytokine (6G), myeloid-attracting chemokine (6H), lymphocyte-attracting chemokine (6I), extracellular matrix (6J), hypoxia (6K), and miscellaneous (6L) mRNAs.

Figure 5. Gene expression changes over time. The log2 gene expression values in 4 tumor PCTS (3 Meso and 1 HNC) were measured at baseline (time 0) and at 24, 48, and 72 h after culture. Examples of genes downregulated include ( A ) E-cadherin (CDH1), ( B ) VEGFR2 (KDR), and ( C ) collagen 4A2 (COL4A2). Examples of genes upregulated include ( D ) interleukin 11 (IL-11), ( E ) CXCL3, and ( F ) HIF1A.
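A simple way to visualize this fresh-to-24 h jump followed by a plateau is to plot per-tumor log2 expression against time in culture, as in the sketch below; the values are invented for illustration and are not taken from the study.

```python
# Illustrative sketch: log2 expression of one gene over the PCTS time course
# (0, 24, 48, 72 h) for each tumor, mirroring the layout of Fig. 5.
# All numbers below are made-up placeholders, not study data.
import matplotlib.pyplot as plt

timepoints = [0, 24, 48, 72]                 # hours in culture
example_log2_cpm = {                         # hypothetical values
    "Meso 1": [6.1, 8.0, 8.2, 8.1],
    "Meso 2": [5.4, 7.1, 7.4, 7.2],
    "HNC 1": [4.9, 6.0, 6.3, 6.1],
}

for tumor, values in example_log2_cpm.items():
    plt.plot(timepoints, values, marker="o", label=tumor)
plt.xlabel("Hours in culture")
plt.ylabel("log2 expression (CPM)")
plt.title("IL-11 (illustrative values)")
plt.legend()
plt.show()
```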

Recent advances in cancer research have expanded treatment options for many cancers, yet their use remains limited to select populations 10 , 11 . Thus, there is a need for predictive preclinical models with which we can understand translational failures and accelerate the development of novel therapeutics. Historically, 2D and 3D cell cultures, murine and other animal models, as well as patient-derived xenografts, have been utilized for preclinical studies; however, they are limited by their inability to fully recapitulate the complex human TME. PCTS are an ex vivo system that largely preserves the 3D tumor architecture, along with its TME and stroma. They are relatively easy to produce, and they can offer a platform for conducting therapy-based experiments on human solid tumor tissue 5 . However, our current understanding of the complex changes that occur during slice production and culture (even without any treatment-induced changes) is quite limited. In this study, we used transcriptomics to comprehensively study the mRNA changes that occur in two tumor types, namely HNC and Meso, especially within the first 24 h following slicing (i.e. PCTS production).

Our data showed that PCTS from HNC and Meso retained histological fidelity for up to 72 h in culture. Other studies using PCTS derived from HNCs have demonstrated a similar viability of 2–7 days 6 , 7 , 12 . This is likely tumor-type dependent, as others have reported that PCTS from other tumor types (murine tumors, colorectal, non-small cell lung cancer and liver tumors) were viable for up to 12 days 8 , 13 , 14 , 15 . It may also be culture-specific since, in our experience, different culture media, with or without serum supplementation, and different culture plates can affect PCTS viability 12 . Collectively, these data demonstrate the importance of (1) assessing viability for each type of tumor studied and (2) fully optimizing the culture conditions employed.

The major finding of this study is that, although the PCTS appear to remain relatively unchanged histologically in the first 72 h of culture, they undergo major transcriptomic changes during this period, especially during the first 24 h following slicing. Unlike the study of Ghaderi et al. 4 which reported very few changes (zero to only 56 genes) in PCTS after culture, our tumor data are similar to the results published by Bigaeva et al. 8 that demonstrated extensive transcriptional changes from PCTS made from normal and fibrotic mouse and human tissues.

We observed that the expression of some cell type-specific mRNAs decreased. This included mRNAs associated with T cells (especially the mRNA for CD4), fibroblasts, and most prominently, endothelial cells. Interestingly, although major macrophage-specific mRNAs (CD68, CD14, and ITGAM) were relatively unchanged, the mRNAs associated with an M2 phenotype (CSFR1, CD163, MS4A4A, and MRC1) were decreased, suggesting a change toward a more M1, anti-tumor status.

One of the major changes we observed was in the chemokine/cytokine milieu of the PCTS. IFNγ-related mRNAs were consistently downregulated, including the mRNAs for chemokines that attract lymphocytes (such as CXCL9, 10, 11 and CCL5) and the HLA Class II-regulating mRNAs (i.e., CIITA and RFX5). Expression of HLA Class I and antigen-presenting machinery mRNAs were less affected. By contrast, mRNAs for chemokines that attract myeloid cells (such as CXCL2, CXCL5, CXCL8, and CCL2), were markedly and significantly upregulated. In addition, several inflammatory cytokine gene mRNAs were highly increased, including IL-1A, IL-1B, TNF, and IL-6. One of the most highly upregulated mRNAs was IL-11, which was also prominently increased in the study examining PCTS of normal tissues 8 , 16 . IL-11 is a member of the IL-6 family of cytokines and has been implicated in the pathogenesis of fibrosis and solid malignancy, as well as inflammation 17 , 18 .

Another prominent set of changes that we observed involves the upregulation of mRNAs regulating tissue injury and repair, including the wound-induced matrix protein tenascin. This initial response may be the result of the traumatic effects of slicing the PCTS and was also observed in the normal tissue PCTS 8 . However, many ECM mRNAs were downregulated, including many collagen genes as well as those for the extracellular matrix proteins perlecan and lumican, especially in the Meso PCTS. Consistent with matrix remodeling, mRNAs for MMPs 1, 3, 9, and 10 were markedly upregulated. TGF-β1 mRNA and several TGF-β-stimulated genes (IL-11, Serpine 1, INHBA) were significantly increased. Finally, there was transcriptomic evidence of a hypoxic response, with many hypoxia-associated mRNAs (VEGFA, VEGFC, HIF1a, CXCL8, PLAUR, and MMP3) showing significant or close to significant upregulation.

Overall, our transcriptomic analysis data in tumors agree with those previously seen in normal tissues 8 and suggest that, when tumors are removed from the body, sliced into PCTS, and cultured ex vivo for 24 h, they undergo a complex set of changes. There is a tendency toward loss of endothelial cells, fibroblasts, and lymphocytes, and a shift towards an M1 vs M2 macrophage phenotype. This is associated with a downregulation of IFNγ-induced mRNAs and processes. The other prominent changes relate to wounding and hypoxia responses that are highlighted by marked changes in the cytokine/chemokine milieu and extracellular matrix. These wounding and hypoxic programs likely begin the same way as would be observed in vivo but are altered in the PCTS because the normal influx of blood cells (i.e. platelets, neutrophils, monocytes, fibrocytes, etc.) and angiogenesis cannot occur due to the lack of any vascular connections. The largest changes seemed to occur in the initial 24 h of PCTS culture and then tended to stabilize. This is important because it suggests that experimental manipulations to the PCTS would best be studied 24 h after slicing, to allow the PCTS to “de-stress”, adapt to the culture conditions, and allow mRNA levels to stabilize.

Given these rather substantial changes, each investigator will need to account for how these changes might affect specific experiments and consider validating these changes at the protein level. Our findings suggest that studies looking at effects of manipulations on the tumor cells themselves (i.e., via addition of chemotherapy or drugs) might be relatively unaffected 19 , 20 , 21 , but this is likely tumor cell specific. In our examples, tumor-selective mRNAs in the HNCs (EGFR, CDH1) were stable; however, mesothelioma-selective mRNAs (i.e., mesothelin and WT1) were reduced. Studies requiring antigen presentation might be misleading. It is unclear how experiments in which cells are added to PCTS to monitor migration would be affected 21 , 22 , 23 , because of the changes occurring in the extracellular matrix. In the handful of studies that have utilized PCTS for CAR-T cell or genetically-engineered macrophage assessment, results have been reported to show targeted tumor cell death with associated cytokine release in the supernatant 23 , 24 , with the immune responses seen in the macrophage study shown to be consistent between patient-derived PCTS 23 . In our opinion, the utility of PCTS as a platform in adoptive cell transfer research is promising but requires further investigation.

There are a number of limitations to our study that should be considered. We studied only two types of tumors, each showing its own specific changes. Although most changes that we observed were similar between the two tumor types, basal levels of many mRNAs were quite different amongst the tumors. Some changes occurred in the Meso PCTS, but not the HNC PCTS, and vice versa; we also saw examples of changes in opposite directions. For example, after culture, COL7A1 mRNA was significantly decreased in HNC, but significantly increased in Meso. Accordingly, each tumor type should be carefully studied for specific changes. Another limitation is that our transcriptomic data was derived from only a small number of tumors and lacked multiple biological replicates at all time points. Because of this, our primary method of analysis was to use each tumor as its own control and calculate fold changes over time rather than comparing the mean values of all PCTS at different time points. However, despite having a relatively low level of statistical power, many of the changes observed were consistent and significant using paired t-tests. The validity of our conclusions is bolstered by the somewhat surprising similarities between the upregulated mRNAs that we observed in our tumor slices compared to those reported previously in normal tissues 8 . Remarkably, all 6 of the most upregulated mRNAs in the PCTS from normal tissues (IL-6, CXCL5, CXCL8, MMP1, MMP3, and TFPI2) were among our top 36 most upregulated genes and had an average fold increase of 24.4. However, additional transcriptomic data to define changes in PCTS over time would benefit the field. Finally, this study focused only on generating transcriptomic data. Given the potential lack of correlation between mRNA and protein expression, especially in tumors, it would be of value to validate relevant transcriptomic findings using proteomic methods such as immunoblotting, immunohistochemistry and/or flow cytometry. Preliminary studies by our group suggest that it is possible to obtain enough cells to conduct multi-color flow cytometry from individual slices.

In summary, the ability of PCTS to retain the original tumor microenvironment and architecture makes them an attractive model to study tumor biology, therapies, immunology and immunotherapies. However, the investigator should be aware that there are multiple dynamic transcriptional changes that occur after slicing and during the early culturing of PCTS that can differ among tumor types and among patients. Our findings indicate that this complex program of changes involves the potential decrease of several cell types within the PCTS, along with wounding and hypoxia responses that are highlighted by marked changes in the mRNA levels of cytokines, chemokines, HLA Class II molecules, and extracellular matrix. Depending on the experimental questions asked, these “baseline” changes need to be carefully considered.

Materials and methods

Tissue samples.

Surplus resected tumor material from human head and neck (HNC) and mesothelioma (Meso) tumors was obtained from the operating room immediately after surgical resection. Informed written consent related to two IRB approvals (Penn IRB protocols #813004 and #417200) was obtained prior to donation. Samples were transported from the operating room to the laboratory in ice-cold media (DMEM/F12, 10% FBS, 1% Penicillin/Streptomycin) and processed within 1 h. The tissue from HNC (n = 3) and Meso (n = 3) tumors was used for transcriptomic evaluation.

Preparation of precision-cut tumor slices (PCTS)

PCTS of 300 μm in thickness were prepared using a Compresstome (Precisionary Instruments LLC, VF-300). Tumor samples were mounted on the tissue plunger and embedded in 2% low-melt agarose, as previously described 5 . Several PCTS were placed immediately in 10% formalin for fixation and labeled “Fresh/Time 0/0 h”. The remaining PCTS were placed on top of cell culture inserts (Millipore) in a 24-well tissue culture plate and cultured in DMEM/F12 media supplemented with 10% FBS and 1% penicillin/streptomycin at 37 °C in a 5% CO2 incubator. At various time points (24, 48, and 72 h), the PCTS were fixed in 10% formalin. All fixed PCTS were subsequently embedded in paraffin, and standard 3 μm sections were cut and placed on glass slides. Slides were used for H&E or transcriptomic analyses.

HTG transcriptome panel

FFPE slides were reviewed, and areas of tumor were marked by a head and neck pathologist. Depending on the size, one to three marked slides from each PCTS were sent to HTG Molecular Diagnostics at the HTG VERI/O commercial laboratory in Tucson, Arizona. For this study, we used a newly available HTG Transcriptome Panel that interrogated 19,398 genes simultaneously ( https://www.htgmolecular.com/assays/htp ). HTG EdgeSeq probes target a single location of each RNA transcript, resulting in a single probe sequence per RNA transcript. The counts are therefore stoichiometrically equal to the number of transcripts in the sample, and normalization by transcript length is not required. The HTG EdgeSeq data are transformed into gene counts per million (CPM), and normalized CPM gene reads were provided by HTG. Genes with low expression levels (less than 1 CPM) were filtered out, leaving approximately 11,000 genes for analysis.
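For orientation, the short sketch below shows how CPM normalization and a < 1 CPM filter could be applied to a raw count table. It is a generic illustration under assumed file and column names, not HTG's software, and the exact filtering rule (here, at least 1 CPM in at least one sample) is an assumption since the text does not specify it.

```python
# Sketch: convert raw probe counts to counts per million (CPM) and drop
# low-expression genes, assuming a genes-by-samples table in a CSV file.
import pandas as pd

counts = pd.read_csv("htg_raw_counts.csv", index_col=0)     # hypothetical file

# CPM: scale each sample (column) so its counts sum to one million
cpm = counts.div(counts.sum(axis=0), axis=1) * 1e6

# Keep genes with at least 1 CPM in at least one sample (assumed rule)
keep = (cpm >= 1).any(axis=1)
cpm_filtered = cpm.loc[keep]
print(f"{int(keep.sum())} of {len(cpm)} genes retained for analysis")
```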

Differentially expressed gene analysis

Given our goal of examining changes in the PCTS over time, the relatively small number of PCTS examined, the known differences between HNC and Meso, the presumed heterogeneity between individual tumors of even the same histology, and the lack of biological replicates at many of our time points, our primary method of analysis was to use each baseline (Time 0) PCTS as the control for comparison to later time points from the same tumor, rather than comparing the averages of groups of PCTS at different time points. This allowed us to use paired analyses and gave us six comparisons at each time point for each gene to analyze. For each of the six tumors, we calculated the fold change of any given gene at 24 h compared to time zero and calculated the median fold change. We then screened for genes that had a paired t-test p value < 0.05 (uncorrected) and looked at various fold-change thresholds.
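A minimal sketch of this paired screening strategy is given below, assuming a filtered CPM matrix with hypothetical sample names; the 2-fold cut-off shown is only one of the thresholds the text says were explored.

```python
# Sketch: per-gene fold change at 24 h vs Time 0 within each tumor, the median
# fold change across tumors, and an uncorrected paired t-test on log2 values.
import numpy as np
import pandas as pd
from scipy.stats import ttest_rel

cpm = pd.read_csv("pcts_cpm_filtered.csv", index_col=0)     # genes x samples
tumors = ["HNC1", "HNC2", "HNC3", "Meso1", "Meso2", "Meso3"]
fresh = cpm[[f"{t}_fresh" for t in tumors]].values
h24 = cpm[[f"{t}_24h" for t in tumors]].values

fold_change = (h24 + 1) / (fresh + 1)           # +1 avoids division by zero
median_fc = np.median(fold_change, axis=1)
t_stat, p = ttest_rel(np.log2(h24 + 1), np.log2(fresh + 1), axis=1)

results = pd.DataFrame({"median_fold_change": median_fc, "p_uncorrected": p},
                       index=cpm.index)
changed = results[(results["p_uncorrected"] < 0.05) &
                  ((results["median_fold_change"] > 2) |
                   (results["median_fold_change"] < 0.5))]
print(changed.sort_values("p_uncorrected").head(25))
```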

Due to our relatively small sample sizes, we generated lists of changed genes for pathway analysis using the following criteria (with a bias towards being inclusive). All genes with an average greater than twofold change from the Time 0 PCTS to the 24 h PCTS were reviewed manually. If the p value was > 0.05, we reviewed for outliers. If consistent changes were seen except for one outlier, the gene was kept in the list. If there were no clear outliers, the gene was cut from the list. Our primary analysis was performed on genes that were changed in both lung mesothelioma and head and neck cancer.
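The outlier review described above was manual, but a simple automatic pre-screen along the same lines is sketched below: a gene is flagged as "consistent except for one outlier" if all but at most one tumor change in the same direction by at least twofold. The helper function and its threshold are assumptions for illustration only.

```python
# Sketch: leave-one-out consistency check on per-tumor log2 fold changes,
# approximating the manual outlier review described in the text.
import numpy as np

def consistent_without_one_outlier(log2fc_per_tumor, min_abs_log2fc=1.0):
    """True if all but at most one tumor change in the same direction by >= 2-fold."""
    fc = np.asarray(log2fc_per_tumor, dtype=float)
    for leave_out in range(len(fc)):
        rest = np.delete(fc, leave_out)
        if np.all(rest >= min_abs_log2fc) or np.all(rest <= -min_abs_log2fc):
            return True
    return False

# Five tumors up >= 2-fold with one clear outlier -> keep for manual review
print(consistent_without_one_outlier([1.4, 2.0, 1.1, 1.8, 1.2, -0.9]))   # True
# Mixed, inconsistent changes -> cut from the list
print(consistent_without_one_outlier([1.4, -0.8, 1.1, 0.2, 1.2, -0.9]))  # False
```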

We also conducted a pathway analysis of the differentially expressed genes (DEGs). For pathway analysis and Gene Ontology analysis, up- and down-regulated DEGs were uploaded to InnateDB 25 . InnateDB tests for over-representation of DEGs within the KEGG, REACTOME and Gene Ontology databases. p values were adjusted for multiple comparisons using the Benjamini–Hochberg (B–H) method, and an FDR < 0.05 was considered significant 26 .
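For the multiple-comparison step, a Benjamini–Hochberg adjustment equivalent to the B–H correction described above can be reproduced with statsmodels, as sketched below; the p values are invented for illustration.

```python
# Sketch: Benjamini-Hochberg FDR adjustment of pathway enrichment p values.
from statsmodels.stats.multitest import multipletests

pathway_pvals = [0.0004, 0.003, 0.011, 0.04, 0.21, 0.66]    # hypothetical
reject, fdr, _, _ = multipletests(pathway_pvals, alpha=0.05, method="fdr_bh")
for p, q, sig in zip(pathway_pvals, fdr, reject):
    flag = "significant" if sig else "not significant"
    print(f"p = {p:.4f} -> FDR = {q:.4f} ({flag})")
```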

Data availability

Gene expression data were deposited into the Gene Expression Omnibus database under accession number GSE250038. To review GEO accession GSE250038, go to: https://www.ncbi.nlm.nih.gov/geo/query/acc.cgi?acc=GSE250038 . Token: mhadwayyvzupvmd.

Ireson, C. R., Alavijeh, M. S., Palmer, A. M., Fowler, E. R. & Jones, H. J. The role of mouse tumour models in the discovery and development of anticancer drugs. Br. J. Cancer 121 , 101–108 (2019).


Law, A. M. K. et al. Advancements in 3D cell culture systems for personalizing anti-cancer therapies. Front. Oncol. 11 , 782766 (2021).


Kim, M. et al. Patient-derived lung cancer organoids as in vitro cancer models for therapeutic screening. Nat. Commun. 10 , 3991 (2019).


Ghaderi, M. et al. Genome-wide transcriptome profiling of ex-vivo precision-cut slices from human pancreatic ductal adenocarcinoma. Sci. Rep. 10 , 9070 (2020).


Dimou, P., Trivedi, S., Liousia, M., D’Souza, R. R. & Klampatsa, A. Precision-cut tumor slices (PCTS) as an ex vivo model in immunotherapy research. Antibodies (Basel) 11 , 26 (2022).


Gerlach, M. M. et al. Slice cultures from head and neck squamous cell carcinoma: A novel test system for drug susceptibility and mechanisms of resistance. Br. J. Cancer 110 , 479–488 (2014).

Runge, A. et al. Patient-derived head and neck tumor slice cultures: A versatile tool to study oncolytic virus action. Sci. Rep. 12 , 15334 (2022).

Bigaeva, E. et al. Transcriptomic characterization of culture-associated changes in murine and human precision-cut tissue slices. Arch. Toxicol. 93 , 3549–3583 (2019).

Vitali, F. et al. Exploring wound-healing genomic machinery with a network-based approach. Pharmaceuticals (Basel) 10 , 55 (2017).


Roy, R., Singh, S. K. & Misra, S. Advancements in cancer immunotherapies. Vaccines (Basel) 11 , 59 (2022).

Liu, C., Yang, M., Zhang, D., Chen, M. & Zhu, D. Clinical cancer immunotherapy: Current progress and prospects. Front. Immunol. 13 , 961805 (2022).

Greier, M. D. C. et al. Optimizing culturing conditions in patient derived 3D primary slice cultures of head and neck cancer. Front. Oncol. 13 , 1145817 (2023).

Junk, D. et al. Human tissue cultures of lung cancer predict patient susceptibility to immune-checkpoint inhibition. Cell Death Discov. 7 , 264 (2021).

Jabbari, N. et al. Modulation of immune checkpoints by chemotherapy in human colorectal liver metastases. Cell. Rep. Med. 1 , 100160 (2020).

Westra, I. M. et al. Human precision-cut liver slices as a model to test antifibrotic drugs in the early onset of liver fibrosis. Toxicol. In Vitro 35 , 77–85 (2016).

Fung, K. Y. et al. Emerging roles for IL-11 in inflammatory diseases. Cytokine 149 , 155750 (2022).

Putoczki, T. L. & Ernst, M. IL-11 signaling as a therapeutic target for cancer. Immunotherapy 7 , 441–453 (2015).

Naipal, K. A. et al. Tumor slice culture system to assess drug response of primary breast cancer. BMC Cancer 16 , 78 (2016).

Roife, D. et al. Ex vivo testing of patient-derived xenografts mirrors the clinical outcome of patients with pancreatic ductal adenocarcinoma. Clin. Cancer Res. 22 , 6021–6030 (2016).

Tieu, T. et al. Patient-derived prostate cancer explants: A clinically relevant model to assess siRNA-based nanomedicines. Adv. Healthc. Mater. 10 , e2001594 (2021).

Peranzoni, E. et al. Ex vivo imaging of resident CD8 T lymphocytes in human lung tumor slices using confocal microscopy. J. Vis. Exp. 130 , e55709 (2017).


Boutet, M. et al. TGFbeta signaling intersects with CD103 integrin signaling to promote T-lymphocyte accumulation and antitumor activity in the lung tumor microenvironment. Cancer Res. 76 , 1757–1769 (2016).

Brempelis, K. J. et al. Genetically engineered macrophages persist in solid tumors and locally deliver therapeutic proteins to activate immune responses. J. Immunother. Cancer 8 , 2 (2020).


Kantari-Mimoun, C. et al. CAR T-cell entry into tumor islets is a two-step process dependent on IFNgamma and ICAM-1. Cancer Immunol. Res. 9 , 1425–1438 (2021).

Breuer, K. et al. InnateDB: Systems biology of innate immunity and beyond-recent updates and continuing curation. Nucleic Acids Res. 41 , D1228-1233 (2013).

Chen, S. Y., Feng, Z. & Yi, X. A general introduction to adjustment for multiple comparisons. J. Thorac. Dis. 9 , 1725–1729 (2017).


Funding was provided by an ASCO Young Investigator Award, National Institutes of Health grant no. P01-CA217805, a Penn Medicine Head and Neck Cancer Grant, the Cris Cancer Foundation, and the June Hancock Mesothelioma Research Fund.

Author information

Authors and affiliations.

Division of Hematology and Oncology, Department of Medicine, University of North Carolina at Chapel Hill, Charlotte, NC, USA

Sumita Trivedi

Division of Pulmonary and Critical Care Medicine, Department of Medicine, Center for Cellular Immunology, University of Pennsylvania, Philadelphia, PA, USA

Caitlin Tilsed, Maria Liousia & Steven M. Albelda

Department of Otorhinolaryngology-Head and Neck Surgery, University of Pennsylvania, Philadelphia, PA, USA

Robert M. Brody & Karthik Rajasekaran

Division of Thoracic Surgery, Department of Surgery, University of Pennsylvania, Philadelphia, PA, USA

Sunil Singhal

Division of Cancer Therapeutics, The Institute of Cancer Research, London, UK

Astero Klampatsa


Contributions

S.T., S.M.A. and A.K. conceptualized and designed the study. S.T., C.T., M.L., J.B., R.M.B., K.R., A.K. contributed to the experimental work. S.T., C.T., S.M.A. and A.K. analyzed experimental findings and prepared all figures. A.K. wrote the main manuscript, with editing help from S.T., C.T. and S.M.A. All authors have reviewed the manuscript.

Corresponding author

Correspondence to Astero Klampatsa .

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Trivedi, S., Tilsed, C., Liousia, M. et al. Transcriptomic analysis-guided assessment of precision-cut tumor slices (PCTS) as an ex-vivo tool in cancer research. Sci Rep 14 , 11006 (2024). https://doi.org/10.1038/s41598-024-61684-1


Received: 13 March 2024

Accepted: 08 May 2024

Published: 14 May 2024

DOI: https://doi.org/10.1038/s41598-024-61684-1

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

Keywords

  • Precision-cut slices
  • Immune response
  • Transcriptomics
  • Mesothelioma
  • Tumor model


medRxiv

Impact of the use of cannabis as a medicine in pregnancy, on the unborn child: a systematic review and meta-analysis protocol


Introduction: The use of cannabis for medicinal purposes is on the rise. As more people place their trust in the safety of prescribed plant-based alternative medicines and find them easily accessible, there is growing concern that pregnant women may increasingly be using cannabis for medicinal purposes to manage pregnancy symptoms and other health conditions. The aim of this review is to investigate the use of cannabis for medicinal purposes during pregnancy, describe the demographic characteristics of this population, and measure the impact on the unborn child and up to twelve months postpartum.

Methods and analyses: The review will include research on pregnant women who use cannabis for medicinal purposes only, and on infants up to one year after birth who experienced in utero exposure to cannabis used for medicinal purposes. Reviews, randomised controlled trials, case-control, cross-sectional, and cohort studies that have been peer reviewed and published between 1996 and April 2024 as primary research investigating the effect of prenatal use of cannabis for medicinal purposes on foetal, perinatal, and neonatal outcomes will be selected for review. Editorials, letters, commentaries, protocols, conference papers, and book chapters will be excluded. The effects of illicit drug use, alcohol misuse, and nicotine exposure on neonatal outcomes will be controlled for by excluding studies that report concomitant use of such substances with cannabis for medicinal purposes during pregnancy. All titles and abstracts will be reviewed independently and in duplicate by at least two researchers. Records will be excluded based on title and abstract screening as well as publication type. Where reviewers initially disagree about the inclusion of a study, team members will review the disputed article until consensus is reached. Selected studies will then be assessed by at least two independent researchers for risk of bias using validated tools. Data will be extracted and analysed following systematic review and meta-analysis methodology. The statistical analysis will combine three or more outcomes that are reported in a consistent manner. The systematic review and meta-analysis will follow the PRISMA guidelines to facilitate transparent reporting [1].
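
As a rough illustration of the kind of pooling such an analysis might involve (this sketch is not part of the protocol), the following Python example computes an inverse-variance, random-effects pooled estimate using the DerSimonian-Laird estimator. The effect sizes and variances are hypothetical.

```python
import numpy as np

# Hypothetical standardized mean differences and their variances from three studies
effects = np.array([0.30, 0.45, 0.20])
variances = np.array([0.02, 0.05, 0.03])

# Fixed-effect (inverse-variance) weights and pooled estimate
w = 1.0 / variances
fixed_pooled = np.sum(w * effects) / np.sum(w)

# DerSimonian-Laird estimate of between-study variance (tau^2)
q = np.sum(w * (effects - fixed_pooled) ** 2)   # Cochran's Q
df = len(effects) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights, pooled estimate, and its standard error
w_re = 1.0 / (variances + tau2)
re_pooled = np.sum(w_re * effects) / np.sum(w_re)
re_se = np.sqrt(1.0 / np.sum(w_re))

print(f"Random-effects pooled estimate: {re_pooled:.3f} (SE {re_se:.3f}, tau^2 {tau2:.3f})")
```

In practice a dedicated package (for example R's metafor) would typically be used, but the arithmetic above is the core of the "combine three or more outcomes" step.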

Competing Interest Statement

The authors have declared no competing interest.

Funding Statement

This study did not receive any funding.

Author Declarations

I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained.

The details of the IRB/oversight body that provided approval or exemption for the research described are given below:

The study will use ONLY openly available human data from studies published in biomedical and scientific journals.

I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients or participants themselves) outside the research group so cannot be used to identify individuals.

I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance).

I have followed all appropriate research reporting guidelines, such as any relevant EQUATOR Network research reporting checklist(s) and other pertinent material, if applicable.

Data Availability

All data produced in the present work are contained in the manuscript.


Research Method

Research Methodology – Types, Examples and Writing Guide

Research Methodology

Definition:

Research Methodology refers to the systematic and scientific approach used to conduct research, investigate problems, and gather data and information for a specific purpose. It involves the techniques and procedures used to identify, collect, analyze, and interpret data to answer research questions or solve research problems. Moreover, it encompasses the philosophical and theoretical frameworks that guide the research process.

Structure of Research Methodology

Research methodology formats can vary depending on the specific requirements of the research project, but the following is a basic example of a structure for a research methodology section:

I. Introduction

  • Provide an overview of the research problem and the need for a research methodology section
  • Outline the main research questions and objectives

II. Research Design

  • Explain the research design chosen and why it is appropriate for the research question(s) and objectives
  • Discuss any alternative research designs considered and why they were not chosen
  • Describe the research setting and participants (if applicable)

III. Data Collection Methods

  • Describe the methods used to collect data (e.g., surveys, interviews, observations)
  • Explain how the data collection methods were chosen and why they are appropriate for the research question(s) and objectives
  • Detail any procedures or instruments used for data collection

IV. Data Analysis Methods

  • Describe the methods used to analyze the data (e.g., statistical analysis, content analysis)
  • Explain how the data analysis methods were chosen and why they are appropriate for the research question(s) and objectives
  • Detail any procedures or software used for data analysis

V. Ethical Considerations

  • Discuss any ethical issues that may arise from the research and how they were addressed
  • Explain how informed consent was obtained (if applicable)
  • Detail any measures taken to ensure confidentiality and anonymity

VI. Limitations

  • Identify any potential limitations of the research methodology and how they may impact the results and conclusions

VII. Conclusion

  • Summarize the key aspects of the research methodology section
  • Explain how the research methodology addresses the research question(s) and objectives

Research Methodology Types

Types of Research Methodology are as follows:

Quantitative Research Methodology

This is a research methodology that involves the collection and analysis of numerical data using statistical methods. This type of research is often used to study cause-and-effect relationships and to make predictions.
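
For instance, here is a minimal Python sketch of one common quantitative technique, an ordinary least-squares regression used to make a prediction; the study-hours and exam-score data are invented purely for illustration.

```python
import numpy as np

# Hypothetical data: hours of study (x) and exam scores (y) for ten students
x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
y = np.array([52, 55, 61, 60, 68, 71, 75, 74, 82, 85], dtype=float)

# Ordinary least-squares fit of y = slope * x + intercept
slope, intercept = np.polyfit(x, y, deg=1)

# Use the fitted line to predict the score after 12 hours of study
predicted = slope * 12 + intercept
print(f"slope={slope:.2f}, intercept={intercept:.2f}, predicted score for 12 h: {predicted:.1f}")
```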

Qualitative Research Methodology

This is a research methodology that involves the collection and analysis of non-numerical data such as words, images, and observations. This type of research is often used to explore complex phenomena, to gain an in-depth understanding of a particular topic, and to generate hypotheses.
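
As a small illustration, the sketch below applies one simple qualitative technique, word-frequency counting, to hypothetical interview excerpts; frequent terms can point the analyst toward candidate themes. The responses and the stop-word list are invented.

```python
import re
from collections import Counter

# Hypothetical interview excerpts (non-numerical data)
responses = [
    "The therapy sessions helped me feel less anxious about work.",
    "I still feel anxious, but the coping strategies from therapy help.",
    "Work stress is lower now; the sessions gave me practical strategies.",
]

# A deliberately tiny stop-word list, for illustration only
stop_words = {"the", "me", "i", "is", "but", "from", "about", "now", "a"}

words = []
for text in responses:
    words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stop_words]

# The most frequent words hint at candidate themes such as therapy, anxiety, and coping strategies
print(Counter(words).most_common(5))
```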

Mixed-Methods Research Methodology

This is a research methodology that combines elements of both quantitative and qualitative research. This approach can be particularly useful for studies that aim to explore complex phenomena and to provide a more comprehensive understanding of a particular topic.

Case Study Research Methodology

This is a research methodology that involves in-depth examination of a single case or a small number of cases. Case studies are often used in psychology, sociology, and anthropology to gain a detailed understanding of a particular individual or group.

Action Research Methodology

This is a research methodology that involves a collaborative process between researchers and practitioners to identify and solve real-world problems. Action research is often used in education, healthcare, and social work.

Experimental Research Methodology

This is a research methodology that involves the manipulation of one or more independent variables to observe their effects on a dependent variable. Experimental research is often used to study cause-and-effect relationships and to make predictions.

Survey Research Methodology

This is a research methodology that involves the collection of data from a sample of individuals using questionnaires or interviews. Survey research is often used to study attitudes, opinions, and behaviors.

Grounded Theory Research Methodology

This is a research methodology that involves the development of theories based on the data collected during the research process. Grounded theory is often used in sociology and anthropology to generate theories about social phenomena.

Research Methodology Example

An Example of Research Methodology could be the following:

Research Methodology for Investigating the Effectiveness of Cognitive Behavioral Therapy in Reducing Symptoms of Depression in Adults

Introduction:

The aim of this research is to investigate the effectiveness of cognitive-behavioral therapy (CBT) in reducing symptoms of depression in adults. To achieve this objective, a randomized controlled trial (RCT) will be conducted using a mixed-methods approach.

Research Design:

The study will follow a pre-test and post-test design with two groups: an experimental group receiving CBT and a control group receiving no intervention. The study will also include a qualitative component, in which semi-structured interviews will be conducted with a subset of participants to explore their experiences of receiving CBT.

Participants:

Participants will be recruited from community mental health clinics in the local area. The sample will consist of 100 adults aged 18-65 years old who meet the diagnostic criteria for major depressive disorder. Participants will be randomly assigned to either the experimental group or the control group.
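
A minimal sketch of how such a 1:1 random allocation could be implemented is shown below, assuming hypothetical participant IDs 1–100; a real trial would typically use a dedicated randomization service or a blocked/stratified scheme.

```python
import random

random.seed(42)  # fixed seed so the allocation sequence can be reproduced

participant_ids = list(range(1, 101))  # 100 hypothetical participant IDs
random.shuffle(participant_ids)

# Simple 1:1 allocation: first half to the CBT group, second half to the control group
experimental_group = sorted(participant_ids[:50])
control_group = sorted(participant_ids[50:])

print(f"CBT group: {len(experimental_group)} participants, control group: {len(control_group)} participants")
```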

Intervention:

The experimental group will receive 12 weekly sessions of CBT, each lasting 60 minutes. The intervention will be delivered by licensed mental health professionals who have been trained in CBT. The control group will receive no intervention during the study period.

Data Collection:

Quantitative data will be collected through the use of standardized measures such as the Beck Depression Inventory-II (BDI-II) and the Generalized Anxiety Disorder-7 (GAD-7). Data will be collected at baseline, immediately after the intervention, and at a 3-month follow-up. Qualitative data will be collected through semi-structured interviews with a subset of participants from the experimental group. The interviews will be conducted at the end of the intervention period, and will explore participants’ experiences of receiving CBT.
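
To illustrate how such repeated-measures data might be organized for analysis, here is a minimal pandas sketch using a long format (one row per participant per time point); the scores, IDs, and column names are hypothetical.

```python
import pandas as pd

# Hypothetical long-format records: one row per participant per assessment point
records = [
    {"participant_id": 1, "group": "CBT",     "time": "baseline",  "bdi_ii": 28, "gad_7": 14},
    {"participant_id": 1, "group": "CBT",     "time": "post",      "bdi_ii": 15, "gad_7": 9},
    {"participant_id": 1, "group": "CBT",     "time": "follow_up", "bdi_ii": 13, "gad_7": 8},
    {"participant_id": 2, "group": "control", "time": "baseline",  "bdi_ii": 26, "gad_7": 13},
    {"participant_id": 2, "group": "control", "time": "post",      "bdi_ii": 24, "gad_7": 12},
    {"participant_id": 2, "group": "control", "time": "follow_up", "bdi_ii": 25, "gad_7": 12},
]
df = pd.DataFrame.from_records(records)

# Group-level means at each time point, as a quick descriptive summary
print(df.groupby(["group", "time"])[["bdi_ii", "gad_7"]].mean())
```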

Data Analysis:

Quantitative data will be analyzed using descriptive statistics, t-tests, and mixed-model analyses of variance (ANOVA) to assess the effectiveness of the intervention. Qualitative data will be analyzed using thematic analysis to identify common themes and patterns in participants’ experiences of receiving CBT.
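
As a rough sketch of the quantitative side of this analysis plan, the example below runs Welch's t-test and computes Cohen's d on simulated post-intervention BDI-II scores; the group means, spread, and sample sizes are invented, and the mixed-model ANOVA and thematic analysis steps are not shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated post-intervention BDI-II scores (lower = fewer depressive symptoms)
cbt_scores = rng.normal(loc=14, scale=5, size=50)
control_scores = rng.normal(loc=22, scale=5, size=50)

# Welch's t-test, which does not assume equal variances between the groups
t_stat, p_value = stats.ttest_ind(cbt_scores, control_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Cohen's d with a pooled standard deviation, as a simple effect-size estimate
pooled_sd = np.sqrt((cbt_scores.var(ddof=1) + control_scores.var(ddof=1)) / 2)
cohens_d = (control_scores.mean() - cbt_scores.mean()) / pooled_sd
print(f"Cohen's d = {cohens_d:.2f}")
```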

Ethical Considerations:

This study will comply with ethical guidelines for research involving human subjects. Participants will provide informed consent before participating in the study, and their privacy and confidentiality will be protected throughout the study. Any adverse events or reactions will be reported and managed appropriately.

Data Management:

All data collected will be kept confidential and stored securely using password-protected databases. Identifying information will be removed from qualitative data transcripts to ensure participants’ anonymity.

Limitations:

One potential limitation of this study is that it only focuses on one type of psychotherapy, CBT, and may not generalize to other types of therapy or interventions. Another limitation is that the study will only include participants from community mental health clinics, which may not be representative of the general population.

Conclusion:

This research aims to investigate the effectiveness of CBT in reducing symptoms of depression in adults. By using a randomized controlled trial and a mixed-methods approach, the study will provide valuable insights into the mechanisms underlying the relationship between CBT and depression. The results of this study will have important implications for the development of effective treatments for depression in clinical settings.

How to Write Research Methodology

Writing a research methodology involves explaining the methods and techniques you used to conduct research, collect data, and analyze results. It’s an essential section of any research paper or thesis, as it helps readers understand the validity and reliability of your findings. Here are the steps to write a research methodology:

  • Start by explaining your research question: Begin the methodology section by restating your research question and explaining why it’s important. This helps readers understand the purpose of your research and the rationale behind your methods.
  • Describe your research design: Explain the overall approach you used to conduct research. This could be a qualitative or quantitative research design, experimental or non-experimental, case study or survey, etc. Discuss the advantages and limitations of the chosen design.
  • Discuss your sample: Describe the participants or subjects you included in your study. Include details such as their demographics, sampling method, sample size, and any exclusion criteria used.
  • Describe your data collection methods: Explain how you collected data from your participants. This could include surveys, interviews, observations, questionnaires, or experiments. Include details on how you obtained informed consent, how you administered the tools, and how you minimized the risk of bias.
  • Explain your data analysis techniques: Describe the methods you used to analyze the data you collected. This could include statistical analysis, content analysis, thematic analysis, or discourse analysis. Explain how you dealt with missing data, outliers, and any other issues that arose during the analysis.
  • Discuss the validity and reliability of your research: Explain how you ensured the validity and reliability of your study. This could include measures such as triangulation, member checking, peer review, or inter-coder reliability.
  • Acknowledge any limitations of your research: Discuss any limitations of your study, including any potential threats to validity or generalizability. This helps readers understand the scope of your findings and how they might apply to other contexts.
  • Provide a summary: End the methodology section by summarizing the methods and techniques you used to conduct your research. This provides a clear overview of your research methodology and helps readers understand the process you followed to arrive at your findings.

When to Write Research Methodology

Research methodology is typically written after the research proposal has been approved and before the actual research is conducted. It should be written prior to data collection and analysis, as it provides a clear roadmap for the research project.

The research methodology is an important section of any research paper or thesis, as it describes the methods and procedures that will be used to conduct the research. It should include details about the research design, data collection methods, data analysis techniques, and any ethical considerations.

The methodology should be written in a clear and concise manner, and it should be based on established research practices and standards. It is important to provide enough detail so that the reader can understand how the research was conducted and evaluate the validity of the results.

Applications of Research Methodology

Here are some of the applications of research methodology:

  • To identify the research problem: Research methodology is used to identify the research problem, which is the first step in conducting any research.
  • To design the research: Research methodology helps in designing the research by selecting the appropriate research method, research design, and sampling technique.
  • To collect data: Research methodology provides a systematic approach to collect data from primary and secondary sources.
  • To analyze data: Research methodology helps in analyzing the collected data using various statistical and non-statistical techniques.
  • To test hypotheses: Research methodology provides a framework for testing hypotheses and drawing conclusions based on the analysis of data.
  • To generalize findings: Research methodology helps in generalizing the findings of the research to the target population.
  • To develop theories: Research methodology is used to develop new theories and modify existing theories based on the findings of the research.
  • To evaluate programs and policies: Research methodology is used to evaluate the effectiveness of programs and policies by collecting data and analyzing it.
  • To improve decision-making: Research methodology helps in making informed decisions by providing reliable and valid data.

Purpose of Research Methodology

Research methodology serves several important purposes, including:

  • To guide the research process: Research methodology provides a systematic framework for conducting research. It helps researchers to plan their research, define their research questions, and select appropriate methods and techniques for collecting and analyzing data.
  • To ensure research quality: Research methodology helps researchers to ensure that their research is rigorous, reliable, and valid. It provides guidelines for minimizing bias and error in data collection and analysis, and for ensuring that research findings are accurate and trustworthy.
  • To replicate research: Research methodology provides a clear and detailed account of the research process, making it possible for other researchers to replicate the study and verify its findings.
  • To advance knowledge: Research methodology enables researchers to generate new knowledge and to contribute to the body of knowledge in their field. It provides a means for testing hypotheses, exploring new ideas, and discovering new insights.
  • To inform decision-making: Research methodology provides evidence-based information that can inform policy and decision-making in a variety of fields, including medicine, public health, education, and business.

Advantages of Research Methodology

Research methodology has several advantages that make it a valuable tool for conducting research in various fields. Here are some of the key advantages of research methodology:

  • Systematic and structured approach: Research methodology provides a systematic and structured approach to conducting research, which ensures that the research is conducted in a rigorous and comprehensive manner.
  • Objectivity: Research methodology aims to ensure objectivity in the research process, which means that the research findings are based on evidence and not influenced by personal bias or subjective opinions.
  • Replicability: Research methodology ensures that research can be replicated by other researchers, which is essential for validating research findings and ensuring their accuracy.
  • Reliability: Research methodology aims to ensure that the research findings are reliable, which means that they are consistent and can be depended upon.
  • Validity: Research methodology ensures that the research findings are valid, which means that they accurately reflect the research question or hypothesis being tested.
  • Efficiency: Research methodology provides a structured and efficient way of conducting research, which helps to save time and resources.
  • Flexibility: Research methodology allows researchers to choose the most appropriate research methods and techniques based on the research question, data availability, and other relevant factors.
  • Scope for innovation: Research methodology provides scope for innovation and creativity in designing research studies and developing new research techniques.

Research Methodology Vs Research Methods

Although the terms are often used interchangeably, research methodology refers to the overall strategy and rationale behind a study, including the philosophical and theoretical assumptions that justify the chosen approach, whereas research methods are the specific tools and procedures (such as surveys, interviews, experiments, or statistical tests) used to collect and analyze data within that strategy.

About the Author

Muhammad Hassan

Researcher, Academic Writer, Web developer

