Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes. Revised on 20 March 2023.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Step 1: Consider your aims and approach

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive, allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive, with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics. Ask yourself:

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval?

At each stage of the research design process, make sure that your choices are practically feasible.

Step 2: Choose a type of research design

Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.
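
To make the distinction concrete, here is a minimal simulation sketch (with entirely made-up variables) showing how an unmeasured confounder can produce a strong correlation between two variables that have no causal link – the kind of pattern a correlational design can detect but not explain.

```python
# Minimal simulation: a hidden confounder produces a strong correlation
# between two variables that have no causal link to each other.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000

confounder = rng.normal(size=n)              # e.g. an unmeasured third factor
x = 2.0 * confounder + rng.normal(size=n)    # both x and y depend on it...
y = 3.0 * confounder + rng.normal(size=n)    # ...but x does not cause y

r = np.corrcoef(x, y)[0, 1]
print(f"Observed correlation between x and y: {r:.2f}")
# A correlational design would report this strong association, but only a
# (quasi-)experimental design that manipulates x could test causation.
```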

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Common qualitative designs, such as case studies and ethnography, often take similar approaches to data collection but focus on different aspects when analysing the data.

Step 3: Identify your population and sampling method

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling. The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
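
As a rough illustration of the difference, the sketch below contrasts simple random sampling with convenience sampling on a hypothetical list of participant IDs; the population size, sample size, and the "first 200 people" shortcut are assumptions made up for the example.

```python
# Sketch of the two broad sampling approaches, using a hypothetical
# population list of participant IDs.
import random

population = [f"person_{i:04d}" for i in range(5_000)]  # hypothetical sampling frame

# Probability sampling: every member has a known, non-zero chance of selection.
simple_random_sample = random.sample(population, k=200)

# Non-probability (convenience) sampling: whoever is easiest to reach,
# here crudely modelled as the first 200 people on the list.
convenience_sample = population[:200]

# Only the first approach supports confident statistical generalisation;
# the second is quicker but risks systematic bias.
print(len(simple_random_sample), len(convenience_sample))
```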

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Step 4: Choose your data collection methods

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.
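
If you do work with secondary data, the analysis itself often looks like the sketch below. The file name and column names are hypothetical placeholders for a published dataset; the point is that you can answer new questions with existing variables, but not add variables the original study never measured.

```python
# Sketch: re-analysing secondary data. The file name and column names are
# hypothetical placeholders for a dataset another researcher has published.
import pandas as pd

df = pd.read_csv("government_survey_2022.csv")   # assumed existing dataset

# You can answer new questions with the existing variables...
income_by_region = df.groupby("region")["household_income"].median()
print(income_by_region)

# ...but you cannot add variables the original study never measured,
# which limits the conclusions you can draw.
```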

Step 5: Plan your data collection procedures

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations, which events or actions will you count?

If you’re using surveys, which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity have already been established.
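
As a small illustration, the sketch below operationalises "satisfaction" as the average of three Likert-scale items. The item names and the 1–5 scale are assumptions invented for the example, not an established instrument.

```python
# Sketch of operationalising an abstract concept ("satisfaction") as the
# mean of several Likert-scale items. Item names and the 1-5 scale are
# assumptions for illustration only.
def satisfaction_score(responses: dict[str, int]) -> float:
    """Average of three 1-5 Likert items; higher = more satisfied."""
    items = ["enjoys_work", "would_recommend", "feels_valued"]
    return sum(responses[item] for item in items) / len(items)

participant = {"enjoys_work": 4, "would_recommend": 5, "feels_valued": 3}
print(satisfaction_score(participant))  # 4.0
```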

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.
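
One common pilot-stage check is internal consistency. The sketch below computes Cronbach's alpha by hand on a handful of invented pilot responses; the data and the 0.7 rule of thumb are illustrative only.

```python
# Sketch: checking internal consistency (Cronbach's alpha) on pilot data
# for a new questionnaire. The pilot responses below are made up.
import numpy as np

# rows = pilot participants, columns = questionnaire items (1-5 scale)
pilot = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

k = pilot.shape[1]
item_vars = pilot.var(axis=0, ddof=1).sum()      # sum of item variances
total_var = pilot.sum(axis=1).var(ddof=1)        # variance of total scores
alpha = (k / (k - 1)) * (1 - item_vars / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")  # values around 0.7+ are usually acceptable
```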

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?
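
A standard starting point for the sample-size question is a margin-of-error calculation, which you can then inflate for the expected response rate. The sketch below assumes simple random sampling, a 95% confidence level, and a 5% margin of error; swap in your own figures.

```python
# Sketch: a standard sample-size calculation for estimating a proportion,
# assuming simple random sampling, 95% confidence and a 5% margin of error.
import math

z = 1.96        # z-score for 95% confidence
p = 0.5         # most conservative assumed proportion
margin = 0.05   # desired margin of error

n = math.ceil((z ** 2) * p * (1 - p) / margin ** 2)
print(n)  # 385 respondents before accounting for non-response

# In practice you would inflate this for the expected response rate,
# e.g. invite n / 0.4 people if you expect roughly 40% to respond.
```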

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.

Step 6: Decide on your data analysis strategies

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis. With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics, you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
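
As a minimal illustration of these three summaries, the sketch below describes a small set of invented test scores; with real data you would typically use a statistics package, but the quantities are the same.

```python
# Sketch: summarising a sample of test scores with descriptive statistics.
# The scores are invented for illustration.
import statistics
from collections import Counter

scores = [72, 85, 78, 90, 85, 64, 78, 85, 70, 95]

print("Distribution:", Counter(scores))                   # frequency of each score
print("Mean:", statistics.mean(scores))                   # central tendency
print("Median:", statistics.median(scores))               # robust central tendency
print("Standard deviation:", round(statistics.stdev(scores), 2))  # variability
```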

Using inferential statistics, you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
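
The sketch below runs one comparison test and one association test on invented data using scipy; the groups, scores, and variables are placeholders, and with real data you would first check each test's assumptions.

```python
# Sketch: two common inferential tests using scipy, on invented data.
from scipy import stats

# Comparison test: do two groups differ in their outcome scores? (independent t test)
group_a = [23, 25, 28, 30, 27, 24, 26]
group_b = [31, 29, 33, 35, 30, 32, 34]
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Association test: are two variables related? (Pearson correlation)
hours_studied = [1, 2, 3, 4, 5, 6, 7]
exam_score    = [52, 55, 61, 64, 70, 72, 78]
r, p = stats.pearsonr(hours_studied, exam_score)
print(f"r = {r:.2f}, p = {p:.4f}")

# Which test is appropriate depends on your variable types and whether
# your data meet the test's assumptions (e.g. normality).
```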

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis.

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.
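
Purely as an illustration of organising text under codes, the sketch below tags invented interview excerpts with codes based on keyword matches. Real thematic analysis is interpretive and iterative rather than mechanical keyword counting, so treat this only as a toy example of the bookkeeping involved.

```python
# Deliberately simplified sketch: tagging interview excerpts with codes based
# on keyword matches. The codes, keywords, and excerpts are all invented.
codes = {
    "workload": ["busy", "overtime", "deadline"],
    "support":  ["manager", "colleague", "help"],
}

excerpts = [
    "I was doing overtime every week to hit the deadline.",
    "My manager and colleagues help whenever I ask.",
]

for excerpt in excerpts:
    matched = [code for code, keywords in codes.items()
               if any(word in excerpt.lower() for word in keywords)]
    print(matched, "->", excerpt)
```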

Frequently asked questions

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data, it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question.

  • If you want to measure something or test a hypothesis, use quantitative methods. If you want to explore ideas, thoughts, and meanings, use qualitative methods.
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables, use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Research Methods Guide: Research Design & Method


FAQ: Research Design & Method

What is the difference between Research Design and Research Method?

Research design is a plan to answer your research question. A research method is a strategy used to implement that plan. Research design and methods are different but closely related, because good research design ensures that the data you obtain will help you answer your research question more effectively.

Which research method should I choose?

It depends on your research goal and on what subjects (and whom) you want to study. Let's say you are interested in studying what makes people happy, or why some students are more conscious about recycling on campus. To answer these questions, you need to make a decision about how to collect your data. The most frequently used methods include:

  • Observation / Participant Observation
  • Focus Groups
  • Experiments
  • Secondary Data Analysis / Archival Study
  • Mixed Methods (combination of some of the above)

One particular method could be better suited to your research goal than others, because the data you collect from different methods will be different in quality and quantity. For instance, surveys are usually designed to produce relatively short answers, rather than the extensive responses expected in qualitative interviews.

What other factors should I consider when choosing one method over another?

Time for data collection and analysis is something you want to consider. An observation or interview method (a so-called qualitative approach) helps you collect richer information, but it takes time. Using a survey helps you collect more data quickly, yet it may lack detail. So, you will need to consider the time you have for research and the balance between the strengths and weaknesses associated with each method (e.g., qualitative vs. quantitative).

Research Methodologies: Design Your Research Project

Getting started

Occasionally, you may be asked to collect original data and analyze it for a class project. As you develop a research question and select a methodology, ask yourself:

  • What are the assignment criteria? 
  • What data do I need to answer my research question? Are quantitative or qualitative methods better suited to gathering data to answer the research question?
  • Are my collection methods ethical? (See information about UVU's Institutional Review Board below.)

If you're not sure where to start, books from the Fulton Library's collection can help you select a methodology and begin work on your research project.

Additional resources can be found on our guides for research methods courses:

  • BESC 3020: Research Methods for the Behavioral Sciences
  • COMM 3020: Communication Research Methods
  • PSY 3030: Research Methods for Psychology
  • SOC 3030: Social Research Methods

Institutional Review Board

For information on getting a research study with human subjects formally approved by the university, see the website for UVU's Institutional Review Board. IRB approval often isn't required for class research assignments, but their website may be helpful depending on the parameters of your project or if you decide to fully implement your project.

  • UVU's Institutional Review Board (IRB) The Institutional Review Board (IRB) is a committee mandated by federal law to protect the rights and welfare of human subjects participating in research activities. Their website includes information about CITI Training and the research project approval process.

Psychological Tests & Measures

Psychological tests (also known as measures, instruments, or scales) are standardized measures of a particular psychological variable such as personality or emotional functioning. Behavioral science research often involves using psychological tests, whether to simply learn about the test or to use in your own research study.

  • Many tests are commercially published and subject to copyright and licensing restrictions—these typically require paying a fee to use and are not available through the Fulton Library. Publishers may require proof that users have the professional credentials to administer the test. If you find a commercially published test you want to use, look for an FAQ page to see if a version is available to student researchers for free or at a reduced cost, or try contacting the publisher to ask.
  • In addition to commercial tests, there are countless unpublished tests that researchers design for particular studies in psychology, education, and other fields. These tests may be freely available online or in scholarly articles.

Many psychological tests—both unpublished and commercially published—are described in scholarly articles, including how they were developed and validated, and how they can be used in the field. If an article refers to a specific psychological test that you'd like to view or use in your research, try these strategies for finding it:

  • Check the appendixes at the end of the article to see if the author(s) included the full instrument or a note about where it can be found or purchased. Portions of the test may also be included in the main text of the article or in tables or figures.
  • Search for the name of the test on Google as a quick way to see if it's available through a commercial publisher or freely available on a website.
  • Email the article's lead author to ask questions about their test or to ask if they have a version they're willing to share with student researchers. Author contact information is typically included on the first page of scholarly articles. If it's an older article, the contact information may be out of date—try searching for the author on Google to see where they're currently working and look for a current email.
  • FAQ: Finding Information About Psychological Tests (American Psychological Association) This excellent FAQ page from the American Psychological Association includes more information about psychological tests and how to find them.

Databases for Tests & Measures

Use these databases to find articles that reference specific psychological tests and measures. You can either use a search term to look for tests on a topic (for example, search for "metacognition") or you can search for the name of a test you already know about (for example, search for "Metacognition Assessment Scale").

Please be aware that articles found in these databases frequently do not include full tests—once you find information on a test you'd like to see, use the tips above to see if you can locate the full test. Also, these databases often do not include the full text of articles; in these cases, you will need to use the citation details to locate articles in another database like OneSearch. Contact your librarian if you need any help using these databases.

  • HaPI (Health and Psychosocial Instruments) This link opens in a new window This database specializes in psychological instruments and is a great place to start. Search using keywords related to your topic, or by the name of a specific instrument. You will see a list of instruments related to your search. After clicking a title, you will see a record that includes information about the instrument, as well as a Source line with citation details for the article that referenced the instrument (the article may also be included as a PDF). By Behavioral Measurement Database Services; partial full text available.
  • Mental Measurements Yearbook with Tests in Print This link opens in a new window A guide to more than 2,000 contemporary testing instruments. The MMY series contains information on test products within such diverse areas as psychology, education, business, and leadership. No full text available.
  • APA PsycInfo This link opens in a new window While this database includes all types of articles related to psychology, you can also use it to find articles that reference psychological tests and measures. To do this, scroll to the bottom of the database's search page and check the Tests/Measures/Assessment box before running your search, or use the Tests & Measures filter on the left side of your results list.

Research Project – Definition, Writing Guide and Ideas

Definition:

A research project is a planned and systematic investigation into a specific area of interest or problem, with the goal of generating new knowledge, insights, or solutions. It typically involves identifying a research question or hypothesis, designing a study to test it, collecting and analyzing data, and drawing conclusions based on the findings.

Types of Research Project

Types of Research Projects are as follows:

Basic Research

This type of research focuses on advancing knowledge and understanding of a subject area or phenomenon, without any specific application or practical use in mind. The primary goal is to expand scientific or theoretical knowledge in a particular field.

Applied Research

Applied research is aimed at solving practical problems or addressing specific issues. This type of research seeks to develop solutions or improve existing products, services or processes.

Action Research

Action research is conducted by practitioners and aimed at solving specific problems or improving practices in a particular context. It involves collaboration between researchers and practitioners, and often involves iterative cycles of data collection and analysis, with the goal of improving practices.

Quantitative Research

This type of research uses numerical data to investigate relationships between variables or to test hypotheses. It typically involves large-scale data collection through surveys, experiments, or secondary data analysis.

Qualitative Research

Qualitative research focuses on understanding and interpreting phenomena from the perspective of the people involved. It involves collecting and analyzing data in the form of text, images, or other non-numerical forms.

Mixed Methods Research

Mixed methods research combines elements of both quantitative and qualitative research, using multiple data sources and methods to gain a more comprehensive understanding of a phenomenon.

Longitudinal Research

This type of research involves studying a group of individuals or phenomena over an extended period of time, often years or decades. It is useful for understanding changes and developments over time.

Case Study Research

Case study research involves in-depth investigation of a particular case or phenomenon, often within a specific context. It is useful for understanding complex phenomena in their real-life settings.

Participatory Research

Participatory research involves active involvement of the people or communities being studied in the research process. It emphasizes collaboration, empowerment, and the co-production of knowledge.

Research Project Methodology

Research Project Methodology refers to the process of conducting research in an organized and systematic manner to answer a specific research question or to test a hypothesis. A well-designed research project methodology ensures that the research is rigorous, valid, and reliable, and that the findings are meaningful and can be used to inform decision-making.

There are several steps involved in research project methodology, which are described below:

Define the Research Question

The first step in any research project is to clearly define the research question or problem. This involves identifying the purpose of the research, the scope of the research, and the key variables that will be studied.

Develop a Research Plan

Once the research question has been defined, the next step is to develop a research plan. This plan outlines the methodology that will be used to collect and analyze data, including the research design, sampling strategy, data collection methods, and data analysis techniques.

Collect Data

The data collection phase involves gathering information through various methods, such as surveys, interviews, observations, experiments, or secondary data analysis. The data collected should be relevant to the research question and should be of sufficient quantity and quality to enable meaningful analysis.

Analyze Data

Once the data has been collected, it is analyzed using appropriate statistical techniques or other methods. The analysis should be guided by the research question and should aim to identify patterns, trends, relationships, or other insights that can inform the research findings.

Interpret and Report Findings

The final step in the research project methodology is to interpret the findings and report them in a clear and concise manner. This involves summarizing the results, discussing their implications, and drawing conclusions that can be used to inform decision-making.

Research Project Writing Guide

Here are some guidelines to help you in writing a successful research project:

  • Choose a topic: Select a topic that you are interested in and that is relevant to your field of study. It is important to choose a topic that is specific and focused enough to allow for in-depth research and analysis.
  • Conduct a literature review: Conduct a thorough review of the existing research on your topic. This will help you to identify gaps in the literature and to develop a research question or hypothesis.
  • Develop a research question or hypothesis: Based on your literature review, develop a clear research question or hypothesis that you will investigate in your study.
  • Design your study: Choose an appropriate research design and methodology to answer your research question or test your hypothesis. This may include choosing a sample, selecting measures or instruments, and determining data collection methods.
  • Collect data: Collect data using your chosen methods and instruments. Be sure to follow ethical guidelines and obtain informed consent from participants if necessary.
  • Analyze data: Analyze your data using appropriate statistical or qualitative methods. Be sure to clearly report your findings and provide interpretations based on your research question or hypothesis.
  • Discuss your findings: Discuss your findings in the context of the existing literature and your research question or hypothesis. Identify any limitations or implications of your study and suggest directions for future research.
  • Write your project: Write your research project in a clear and organized manner, following the appropriate format and style guidelines for your field of study. Be sure to include an introduction, literature review, methodology, results, discussion, and conclusion.
  • Revise and edit: Revise and edit your project for clarity, coherence, and accuracy. Be sure to proofread for spelling, grammar, and formatting errors.
  • Cite your sources: Cite your sources accurately and appropriately using the appropriate citation style for your field of study.

Examples of Research Projects

Some Examples of Research Projects are as follows:

  • Investigating the effects of a new medication on patients with a particular disease or condition.
  • Exploring the impact of exercise on mental health and well-being.
  • Studying the effectiveness of a new teaching method in improving student learning outcomes.
  • Examining the impact of social media on political participation and engagement.
  • Investigating the efficacy of a new therapy for a specific mental health disorder.
  • Exploring the use of renewable energy sources in reducing carbon emissions and mitigating climate change.
  • Studying the effects of a new agricultural technique on crop yields and environmental sustainability.
  • Investigating the effectiveness of a new technology in improving business productivity and efficiency.
  • Examining the impact of a new public policy on social inequality and access to resources.
  • Exploring the factors that influence consumer behavior in a specific market.

Characteristics of Research Project

Here are some of the characteristics that are often associated with research projects:

  • Clear objective: A research project is designed to answer a specific question or solve a particular problem. The objective of the research should be clearly defined from the outset.
  • Systematic approach: A research project is typically carried out using a structured and systematic approach that involves careful planning, data collection, analysis, and interpretation.
  • Rigorous methodology: A research project should employ a rigorous methodology that is appropriate for the research question being investigated. This may involve the use of statistical analysis, surveys, experiments, or other methods.
  • Data collection: A research project involves collecting data from a variety of sources, including primary sources (such as surveys or experiments) and secondary sources (such as published literature or databases).
  • Analysis and interpretation: Once the data has been collected, it needs to be analyzed and interpreted. This involves using statistical techniques or other methods to identify patterns or relationships in the data.
  • Conclusion and implications: A research project should lead to a clear conclusion that answers the research question. It should also identify the implications of the findings for future research or practice.
  • Communication: The results of the research project should be communicated clearly and effectively, using appropriate language and visual aids, to a range of audiences, including peers, stakeholders, and the wider public.

Importance of Research Project

Research projects are an essential part of the process of generating new knowledge and advancing our understanding of various fields of study. Here are some of the key reasons why research projects are important:

  • Advancing knowledge: Research projects are designed to generate new knowledge and insights into particular topics or questions. This knowledge can be used to inform policies, practices, and decision-making processes across a range of fields.
  • Solving problems: Research projects can help to identify solutions to real-world problems by providing a better understanding of the causes and effects of particular issues.
  • Developing new technologies: Research projects can lead to the development of new technologies or products that can improve people’s lives or address societal challenges.
  • Improving health outcomes: Research projects can contribute to improving health outcomes by identifying new treatments, diagnostic tools, or preventive strategies.
  • Enhancing education: Research projects can enhance education by providing new insights into teaching and learning methods, curriculum development, and student learning outcomes.
  • Informing public policy: Research projects can inform public policy by providing evidence-based recommendations and guidance on issues related to health, education, environment, social justice, and other areas.
  • Enhancing professional development: Research projects can enhance the professional development of researchers by providing opportunities to develop new skills, collaborate with colleagues, and share knowledge with others.

Research Project Ideas

Following are some Research Project Ideas:

Field: Psychology

  • Investigating the impact of social support on coping strategies among individuals with chronic illnesses.
  • Exploring the relationship between childhood trauma and adult attachment styles.
  • Examining the effects of exercise on cognitive function and brain health in older adults.
  • Investigating the impact of sleep deprivation on decision making and risk-taking behavior.
  • Exploring the relationship between personality traits and leadership styles in the workplace.
  • Examining the effectiveness of cognitive-behavioral therapy (CBT) for treating anxiety disorders.
  • Investigating the relationship between social comparison and body dissatisfaction in young women.
  • Exploring the impact of parenting styles on children’s emotional regulation and behavior.
  • Investigating the effectiveness of mindfulness-based interventions for treating depression.
  • Examining the relationship between childhood adversity and later-life health outcomes.

Field: Economics

  • Analyzing the impact of trade agreements on economic growth in developing countries.
  • Examining the effects of tax policy on income distribution and poverty reduction.
  • Investigating the relationship between foreign aid and economic development in low-income countries.
  • Exploring the impact of globalization on labor markets and job displacement.
  • Analyzing the impact of minimum wage laws on employment and income levels.
  • Investigating the effectiveness of monetary policy in managing inflation and unemployment.
  • Examining the relationship between economic freedom and entrepreneurship.
  • Analyzing the impact of income inequality on social mobility and economic opportunity.
  • Investigating the role of education in economic development.
  • Examining the effectiveness of different healthcare financing systems in promoting health equity.

Field: Sociology

  • Investigating the impact of social media on political polarization and civic engagement.
  • Examining the effects of neighborhood characteristics on health outcomes.
  • Analyzing the impact of immigration policies on social integration and cultural diversity.
  • Investigating the relationship between social support and mental health outcomes in older adults.
  • Exploring the impact of income inequality on social cohesion and trust.
  • Analyzing the effects of gender and race discrimination on career advancement and pay equity.
  • Investigating the relationship between social networks and health behaviors.
  • Examining the effectiveness of community-based interventions for reducing crime and violence.
  • Analyzing the impact of social class on cultural consumption and taste.
  • Investigating the relationship between religious affiliation and social attitudes.

Field: Computer Science

  • Developing an algorithm for detecting fake news on social media.
  • Investigating the effectiveness of different machine learning algorithms for image recognition.
  • Developing a natural language processing tool for sentiment analysis of customer reviews.
  • Analyzing the security implications of blockchain technology for online transactions.
  • Investigating the effectiveness of different recommendation algorithms for personalized advertising.
  • Developing an artificial intelligence chatbot for mental health counseling.
  • Investigating the effectiveness of different algorithms for optimizing online advertising campaigns.
  • Developing a machine learning model for predicting consumer behavior in online marketplaces.
  • Analyzing the privacy implications of different data sharing policies for online platforms.
  • Investigating the effectiveness of different algorithms for predicting stock market trends.

Field: Education

  • Investigating the impact of teacher-student relationships on academic achievement.
  • Analyzing the effectiveness of different pedagogical approaches for promoting student engagement and motivation.
  • Examining the effects of school choice policies on academic achievement and social mobility.
  • Investigating the impact of technology on learning outcomes and academic achievement.
  • Analyzing the effects of school funding disparities on educational equity and achievement gaps.
  • Investigating the relationship between school climate and student mental health outcomes.
  • Examining the effectiveness of different teaching strategies for promoting critical thinking and problem-solving skills.
  • Investigating the impact of social-emotional learning programs on student behavior and academic achievement.
  • Analyzing the effects of standardized testing on student motivation and academic achievement.

Field: Environmental Science

  • Investigating the impact of climate change on species distribution and biodiversity.
  • Analyzing the effectiveness of different renewable energy technologies in reducing carbon emissions.
  • Examining the impact of air pollution on human health outcomes.
  • Investigating the relationship between urbanization and deforestation in developing countries.
  • Analyzing the effects of ocean acidification on marine ecosystems and biodiversity.
  • Investigating the impact of land use change on soil fertility and ecosystem services.
  • Analyzing the effectiveness of different conservation policies and programs for protecting endangered species and habitats.
  • Investigating the relationship between climate change and water resources in arid regions.
  • Examining the impact of plastic pollution on marine ecosystems and biodiversity.
  • Investigating the effects of different agricultural practices on soil health and nutrient cycling.

Field: Linguistics

  • Analyzing the impact of language diversity on social integration and cultural identity.
  • Investigating the relationship between language and cognition in bilingual individuals.
  • Examining the effects of language contact and language change on linguistic diversity.
  • Investigating the role of language in shaping cultural norms and values.
  • Analyzing the effectiveness of different language teaching methodologies for second language acquisition.
  • Investigating the relationship between language proficiency and academic achievement.
  • Examining the impact of language policy on language use and language attitudes.
  • Investigating the role of language in shaping gender and social identities.
  • Analyzing the effects of dialect contact on language variation and change.
  • Investigating the relationship between language and emotion expression.

Field: Political Science

  • Analyzing the impact of electoral systems on women’s political representation.
  • Investigating the relationship between political ideology and attitudes towards immigration.
  • Examining the effects of political polarization on democratic institutions and political stability.
  • Investigating the impact of social media on political participation and civic engagement.
  • Analyzing the effects of authoritarianism on human rights and civil liberties.
  • Investigating the relationship between public opinion and foreign policy decisions.
  • Examining the impact of international organizations on global governance and cooperation.
  • Investigating the effectiveness of different conflict resolution strategies in resolving ethnic and religious conflicts.
  • Analyzing the effects of corruption on economic development and political stability.
  • Investigating the role of international law in regulating global governance and human rights.

Field: Medicine

  • Investigating the impact of lifestyle factors on chronic disease risk and prevention.
  • Examining the effectiveness of different treatment approaches for mental health disorders.
  • Investigating the relationship between genetics and disease susceptibility.
  • Analyzing the effects of social determinants of health on health outcomes and health disparities.
  • Investigating the impact of different healthcare delivery models on patient outcomes and cost effectiveness.
  • Examining the effectiveness of different prevention and treatment strategies for infectious diseases.
  • Investigating the relationship between healthcare provider communication skills and patient satisfaction and outcomes.
  • Analyzing the effects of medical error and patient safety on healthcare quality and outcomes.
  • Investigating the impact of different pharmaceutical pricing policies on access to essential medicines.
  • Examining the effectiveness of different rehabilitation approaches for improving function and quality of life in individuals with disabilities.

Field: Anthropology

  • Analyzing the impact of colonialism on indigenous cultures and identities.
  • Investigating the relationship between cultural practices and health outcomes in different populations.
  • Examining the effects of globalization on cultural diversity and cultural exchange.
  • Investigating the role of language in cultural transmission and preservation.
  • Analyzing the effects of cultural contact on cultural change and adaptation.
  • Investigating the impact of different migration policies on immigrant integration and acculturation.
  • Examining the role of gender and sexuality in cultural norms and values.
  • Investigating the impact of cultural heritage preservation on tourism and economic development.
  • Analyzing the effects of cultural revitalization movements on indigenous communities.

About the author

Muhammad Hassan
Researcher, Academic Writer, Web developer

How To Choose Your Research Methodology

Qualitative vs quantitative vs mixed methods.

By: Derek Jansen (MBA). Expert Reviewed By: Dr Eunice Rautenbach | June 2021

Without a doubt, one of the most common questions we receive at Grad Coach is “How do I choose the right methodology for my research?”. It’s easy to see why – with so many options on the research design table, it’s easy to get intimidated, especially with all the complex lingo!

In this post, we’ll explain the three overarching types of research – qualitative, quantitative and mixed methods – and how you can go about choosing the best methodological approach for your research.

Overview: Choosing Your Methodology

  • Understanding the options: qualitative research, quantitative research, mixed methods-based research
  • Choosing a research methodology: nature of the research, research area norms, practicalities

1. Understanding the options

Before we jump into the question of how to choose a research methodology, it’s useful to take a step back to understand the three overarching types of research – qualitative, quantitative and mixed methods-based research. Each of these options takes a different methodological approach.

Qualitative research utilises data that is not numbers-based. In other words, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research makes use of numbers and statistics. Qualitative research investigates the “softer side” of things to explore and describe, while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them.

Importantly, qualitative research methods are typically used to explore and gain a deeper understanding of the complexity of a situation – to draw a rich picture. In contrast to this, quantitative methods are usually used to confirm or test hypotheses. In other words, they have distinctly different purposes. The comparison below highlights a few of the key differences between qualitative and quantitative research.

Qualitative research:

  • Uses an inductive approach
  • Is used to build theories
  • Takes a subjective approach
  • Adopts an open and flexible approach
  • The researcher is close to the respondents
  • Interviews and focus groups are often used to collect word-based data
  • Generally draws on small sample sizes
  • Uses qualitative data analysis techniques (e.g. content analysis, thematic analysis)

Quantitative research:

  • Uses a deductive approach
  • Is used to test theories
  • Takes an objective approach
  • Adopts a closed, highly planned approach
  • The researcher is distant from the respondents
  • Surveys or laboratory equipment are often used to collect number-based data
  • Generally requires large sample sizes
  • Uses statistical analysis techniques to make sense of the data

Mixed methods-based research, as you’d expect, attempts to bring these two types of research together, drawing on both qualitative and quantitative data. Quite often, mixed methods-based studies will use qualitative research to explore a situation and develop a potential model of understanding (this is called a conceptual framework), and then go on to use quantitative methods to test that model empirically.

In other words, while qualitative and quantitative methods (and the philosophies that underpin them) are completely different, they are not at odds with each other. It’s not a competition of qualitative vs quantitative. On the contrary, they can be used together to develop a high-quality piece of research. Of course, this is easier said than done, so we usually recommend that first-time researchers stick to a single approach , unless the nature of their study truly warrants a mixed-methods approach.

The key takeaway here, and the reason we started by looking at the three options, is that it’s important to understand that each methodological approach has a different purpose – for example, to explore and understand situations (qualitative), to test and measure (quantitative) or to do both. They’re not simply alternative tools for the same job. 

Right – now that we’ve got that out of the way, let’s look at how you can go about choosing the right methodology for your research.

2. How to choose a research methodology

To choose the right research methodology for your dissertation or thesis, you need to consider three important factors. Based on these three factors, you can decide on your overarching approach – qualitative, quantitative or mixed methods. Once you’ve made that decision, you can flesh out the finer details of your methodology, such as the sampling, data collection methods and analysis techniques (we discuss these separately in other posts).

The three factors you need to consider are:

  • The nature of your research aims, objectives and research questions
  • The methodological approaches taken in the existing literature
  • Practicalities and constraints

Let’s take a look at each of these.

Factor #1: The nature of your research

As I mentioned earlier, each type of research (and therefore, research methodology), whether qualitative, quantitative or mixed, has a different purpose and helps solve a different type of question. So, it’s logical that the key deciding factor in terms of which research methodology you adopt is the nature of your research aims, objectives and research questions .

But, what types of research exist?

Broadly speaking, research can fall into one of three categories:

  • Exploratory – getting a better understanding of an issue and potentially developing a theory regarding it
  • Confirmatory – confirming a potential theory or hypothesis by testing it empirically
  • A mix of both – building a potential theory or hypothesis and then testing it

As a rule of thumb, exploratory research tends to adopt a qualitative approach, whereas confirmatory research tends to use quantitative methods. This isn’t set in stone, but it’s a very useful heuristic. Naturally then, research that combines a mix of both, or is seeking to develop a theory from the ground up and then test that theory, would utilize a mixed-methods approach.

Let’s look at an example in action.

If your research aims were to understand the perspectives of war veterans regarding certain political matters, you’d likely adopt a qualitative methodology, making use of interviews to collect data and one or more qualitative data analysis methods to make sense of the data.

If, on the other hand, your research aims involved testing a set of hypotheses regarding the link between political leaning and income levels, you’d likely adopt a quantitative methodology, using numbers-based data from a survey to measure the links between variables and/or constructs.

So, the first (and most important) thing you need to consider when deciding which methodological approach to use for your research project is the nature of your research aims, objectives and research questions. Specifically, you need to assess whether your research leans in an exploratory or confirmatory direction or involves a mix of both.

The importance of achieving solid alignment between these three factors and your methodology can’t be overstated. If they’re misaligned, you’re going to be forcing a square peg into a round hole. In other words, you’ll be using the wrong tool for the job, and your research will become a disjointed mess.

If your research is a mix of both exploratory and confirmatory, but you have a tight word count limit, you may need to consider trimming down the scope a little and focusing on one or the other. One methodology executed well has a far better chance of earning marks than a poorly executed mixed methods approach. So, don’t try to be a hero, unless there is a very strong underpinning logic.

Factor #2: The disciplinary norms

Choosing the right methodology for your research also involves looking at the approaches used by other researchers in the field, and studies with similar research aims and objectives to yours. Oftentimes, within a discipline, there is a common methodological approach (or set of approaches) used in studies. While this doesn’t mean you should follow the herd “just because”, you should at least consider these approaches and evaluate their merit within your context.

A major benefit of reviewing the research methodologies used by similar studies in your field is that you can often piggyback on the data collection techniques that other (more experienced) researchers have developed. For example, if you’re undertaking a quantitative study, you can often find tried and tested survey scales with high Cronbach’s alphas. These are usually included in the appendices of journal articles, so you don’t even have to contact the original authors. By using these, you’ll save a lot of time and ensure that your study stands on the proverbial “shoulders of giants” by using high-quality measurement instruments .

Of course, when reviewing existing literature, keep point #1 front of mind. In other words, your methodology needs to align with your research aims, objectives and questions. Don’t fall into the trap of adopting the methodological “norm” of other studies just because it’s popular. Only adopt that which is relevant to your research.

Factor #3: Practicalities

When choosing a research methodology, there will always be a tension between doing what’s theoretically best (i.e., the most scientifically rigorous research design) and doing what’s practical, given your constraints. This is the nature of doing research and there are always trade-offs, as with anything else.

But what constraints, you ask?

When you’re evaluating your methodological options, you need to consider the following constraints:

  • Data access
  • Equipment and software
  • Your knowledge and skills

Let’s look at each of these.

Constraint #1: Data access

The first practical constraint you need to consider is your access to data. If you’re going to be undertaking primary research, you need to think critically about the sample of respondents you realistically have access to. For example, if you plan to use in-person interviews, you need to ask yourself how many people you’ll need to interview, whether they’ll be agreeable to being interviewed, where they’re located, and so on.

If you’re wanting to undertake a quantitative approach using surveys to collect data, you’ll need to consider how many responses you’ll require to achieve statistically significant results. For many statistical tests, a sample of a few hundred respondents is typically needed to develop convincing conclusions.
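
For a rough sense of the numbers involved, a power analysis is the usual way to estimate the required sample size. The sketch below assumes an independent-samples comparison with a medium effect size (d = 0.5), a 5% significance level and 80% power; your own effect-size assumption will drive the answer.

```python
# Sketch: estimating the required sample size for an independent-samples
# comparison, assuming a medium effect size (d = 0.5), alpha = 0.05 and
# 80% power. The effect size is an assumption for illustration.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(round(n_per_group))  # roughly 64 participants per group
```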

So, think carefully about what data you’ll need access to, how much data you’ll need and how you’ll collect it. The last thing you want is to spend a huge amount of time on your research only to find that you can’t get access to the required data.

Constraint #2: Time

The next constraint is time. If you’re undertaking research as part of a PhD, you may have a fairly open-ended time limit, but this is unlikely to be the case for undergrad and Masters-level projects. So, pay attention to your timeline, as the data collection and analysis components of different methodologies have a major impact on time requirements. Also, keep in mind that these stages of the research often take a lot longer than originally anticipated.

Another practical implication of time limits is that it will directly impact which time horizon you can use – i.e. longitudinal vs cross-sectional. For example, if you’ve got a 6-month limit for your entire research project, it’s quite unlikely that you’ll be able to adopt a longitudinal time horizon.

Constraint #3: Money

As with so many things, money is another important constraint you’ll need to consider when deciding on your research methodology. While some research designs will cost near zero to execute, others may require a substantial budget.

Some of the costs that may arise include:

  • Software costs – e.g. survey hosting services, analysis software, etc.
  • Promotion costs – e.g. advertising a survey to attract respondents
  • Incentive costs – e.g. providing a prize or cash payment incentive to attract respondents
  • Equipment rental costs – e.g. recording equipment, lab equipment, etc.
  • Travel costs
  • Food & beverages

These are just a handful of costs that can creep into your research budget. Like most projects, the actual costs tend to be higher than the estimates, so be sure to err on the conservative side and expect the unexpected. It’s critically important that you’re honest with yourself about these costs, or you could end up getting stuck midway through your project because you’ve run out of money.


Constraint #4: Equipment & software

Another practical consideration is the hardware and/or software you’ll need in order to undertake your research. Of course, this variable will depend on the type of data you’re collecting and analysing. For example, you may need lab equipment to analyse substances, or you may need specific analysis software to analyse statistical data. So, be sure to think about what hardware and/or software you’ll need for each potential methodological approach, and whether you have access to these.

Constraint #5: Your knowledge and skillset

The final practical constraint is a big one. Naturally, the research process involves a lot of learning and development along the way, so you will accrue knowledge and skills as you progress. However, when considering your methodological options, you should still consider your current position on the ladder.

Some of the questions you should ask yourself are:

  • Am I more of a “numbers person” or a “words person”?
  • How much do I know about the analysis methods I’ll potentially use (e.g. statistical analysis)?
  • How much do I know about the software and/or hardware that I’ll potentially use?
  • How excited am I to learn new research skills and gain new knowledge?
  • How much time do I have to learn the things I need to learn?

Answering these questions honestly will provide you with another set of criteria against which you can evaluate the research methodology options you’ve shortlisted.

So, as you can see, there is a wide range of practicalities and constraints that you need to take into account when you’re deciding on a research methodology. These practicalities create a tension between the “ideal” methodology and the methodology that you can realistically pull off. This is perfectly normal, and it’s your job to find the option that presents the best set of trade-offs.

Recap: Choosing a methodology

In this post, we’ve discussed how to go about choosing a research methodology. The three major deciding factors we looked at were:

  • The nature of your research aims, objectives and research questions (exploratory, confirmatory, or a combination of the two)
  • The methodological norms within your discipline or research area
  • Practical constraints (data access, time, money, equipment and software, and your knowledge and skillset)

If you’d like a helping hand with your research methodology, check out our 1-on-1 research coaching service, or book a free consultation with a friendly Grad Coach.




What is Research Methodology? Definition, Types, and Examples


Research methodology 1,2 is a structured and scientific approach used to collect, analyze, and interpret quantitative or qualitative data to answer research questions or test hypotheses. A research methodology is like a plan for carrying out research and helps keep researchers on track by limiting the scope of the research. Several aspects must be considered before selecting an appropriate research methodology, such as research limitations and ethical concerns that may affect your research.

The research methodology section in a scientific paper describes the different methodological choices made, such as the data collection and analysis methods, and why these choices were selected. The reasons should explain why the methods chosen are the most appropriate to answer the research question. A good research methodology also helps ensure the reliability and validity of the research findings. There are three types of research methodology—quantitative, qualitative, and mixed-method, which can be chosen based on the research objectives.

What is research methodology?

A research methodology describes the techniques and procedures used to identify and analyze information regarding a specific research topic. It is a process by which researchers design their study so that they can achieve their objectives using the selected research instruments. It includes all the important aspects of research, including research design, data collection methods, data analysis methods, and the overall framework within which the research is conducted. While these points can help you understand what is research methodology, you also need to know why it is important to pick the right methodology.

Why is research methodology important?

Having a good research methodology in place has the following advantages: 3

  • It helps other researchers who may want to replicate your research, because a clear explanation of your methods makes your study reproducible.
  • You can easily answer any questions about your research if they arise at a later stage.
  • A research methodology provides a framework and guidelines for researchers to clearly define research questions, hypotheses, and objectives.
  • It helps researchers identify the most appropriate research design, sampling technique, and data collection and analysis methods.
  • A sound research methodology helps researchers ensure that their findings are valid and reliable and free from biases and errors.
  • It also helps ensure that ethical guidelines are followed while conducting research.
  • A good research methodology helps researchers in planning their research efficiently, by ensuring optimum usage of their time and resources.


Types of research methodology

There are three types of research methodology based on the type of research and the data required. 1

  • Quantitative research methodology focuses on measuring and testing numerical data. This approach is good for reaching a large number of people in a short amount of time. This type of research helps in testing the causal relationships between variables, making predictions, and generalizing results to wider populations.
  • Qualitative research methodology examines the opinions, behaviors, and experiences of people. It collects and analyzes words and textual data. This research methodology typically requires fewer participants but is often more time consuming because a considerable amount of time is spent with each participant. This method is used in exploratory research where the research problem being investigated is not clearly defined.
  • Mixed-method research methodology uses the characteristics of both quantitative and qualitative research methodologies in the same study. This method allows researchers to validate their findings, verify if the results observed using both methods are complementary, and explain any unexpected results obtained from one method by using the other method.

What are the types of sampling designs in research methodology?

Sampling 4 is an important part of a research methodology and involves selecting a representative sample of the population to conduct the study, making statistical inferences about them, and estimating the characteristics of the whole population based on these inferences. There are two types of sampling designs in research methodology—probability and nonprobability.

  • Probability sampling

In this type of sampling design, a sample is chosen from a larger population using some form of random selection, that is, every member of the population has an equal chance of being selected. The different types of probability sampling are:

  • Systematic —sample members are chosen at regular intervals from an ordered sampling frame. It requires selecting a random starting point and a fixed sampling interval (e.g., every tenth person on the list). Because the selection pattern is predefined, it is one of the least time-consuming methods.
  • Stratified —researchers divide the population into smaller groups that don’t overlap but represent the entire population. While sampling, these groups can be organized, and then a sample can be drawn from each group separately.
  • Cluster —the population is divided into naturally occurring groups (clusters), often based on geographic location or organizational units; entire clusters are then randomly selected for the study.
  • Nonprobability sampling

In this type of sampling design, participants are selected through non-random means, so not every member of the population has a known or equal chance of being included. The main types of nonprobability sampling are listed below; a short code sketch after this list illustrates how a few of the sampling designs described in this section can be drawn in practice.

  • Convenience —selects participants who are most easily accessible to researchers due to geographical proximity, availability at a particular time, etc.
  • Purposive —participants are selected at the researcher’s discretion. Researchers consider the purpose of the study and the understanding of the target audience.
  • Snowball —already selected participants use their social networks to refer the researcher to other potential participants.
  • Quota —while designing the study, the researchers decide how many people with which characteristics to include as participants. The characteristics help in choosing people most likely to provide insights into the subject.
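
As a rough illustration (not from the source article), the sketch below shows how a systematic sample, a stratified sample, and a convenience-style selection could be drawn from a hypothetical sampling frame using pandas; the column names, sampling fraction and interval are assumptions.

```python
import pandas as pd

# Hypothetical sampling frame: 1,000 people with an age-group label
frame = pd.DataFrame({
    "person_id": range(1000),
    "age_group": ["18-34", "35-54", "55+"] * 333 + ["18-34"],
})

# Systematic sampling: every 10th person after a random starting point
start = frame.sample(1, random_state=42).index[0] % 10
systematic_sample = frame.iloc[start::10]

# Stratified sampling: draw 5% from each age group separately
stratified_sample = frame.groupby("age_group", group_keys=False).sample(frac=0.05, random_state=42)

# Convenience-style selection (nonprobability): whoever is easiest to reach,
# crudely mimicked here by simply taking the first 50 rows of the frame
convenience_sample = frame.head(50)

print(len(systematic_sample), len(stratified_sample), len(convenience_sample))
```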

What are data collection methods?

During research, data are collected using various methods depending on the research methodology being followed and the research methods being undertaken. Both qualitative and quantitative research have different data collection methods, as listed below.

Qualitative research 5

  • One-on-one interviews: Help the interviewer understand a respondent’s subjective opinions and experiences pertaining to a specific topic or event
  • Document study/literature review/record keeping: Researchers review existing written materials such as archives, annual reports, research articles, guidelines, policy documents, etc.
  • Focus groups: Constructive discussions, usually with a small sample of about 6-10 people and a moderator, to understand the participants’ opinions on a given topic.
  • Qualitative observation: Researchers collect data using their five senses (sight, smell, touch, taste, and hearing).

Quantitative research 6

  • Sampling: The most common type is probability sampling.
  • Interviews: Commonly telephonic or done in-person.
  • Observations: Structured observations are most commonly used in quantitative research. In this method, researchers make observations about specific behaviors of individuals in a structured setting.
  • Document review: Reviewing existing research or documents to collect evidence for supporting the research.
  • Surveys and questionnaires: Surveys can be administered both online and offline depending on the requirement and sample size.


What are data analysis methods?

The data collected using the various methods for qualitative and quantitative research need to be analyzed to generate meaningful conclusions. These data analysis methods 7 also differ between quantitative and qualitative research.

Quantitative research involves a deductive method for data analysis where hypotheses are developed at the beginning of the research and precise measurement is required. The methods include statistical analysis applications to analyze numerical data and are grouped into two categories—descriptive and inferential.

Descriptive analysis is used to summarize the basic features of the data and present them in a way that makes patterns easy to see. The different types of descriptive analysis methods are listed below, followed by a short code sketch:

  • Measures of frequency (count, percent, frequency)
  • Measures of central tendency (mean, median, mode)
  • Measures of dispersion or variation (range, variance, standard deviation)
  • Measure of position (percentile ranks, quartile ranks)
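
A minimal sketch (using a small, made-up dataset) of how these descriptive measures can be computed with pandas; the variable is hypothetical.

```python
import pandas as pd

# Hypothetical exam scores for 10 respondents
scores = pd.Series([55, 61, 61, 70, 72, 75, 78, 84, 90, 95])

# Measures of frequency
print(scores.value_counts())                 # count per observed value

# Measures of central tendency
print(scores.mean(), scores.median(), scores.mode().tolist())

# Measures of dispersion or variation
print(scores.max() - scores.min(), scores.var(), scores.std())

# Measures of position
print(scores.quantile([0.25, 0.5, 0.75]))    # quartile ranks
```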

Inferential analysis is used to make predictions about a larger population based on data collected from a sample of that population. This analysis is used to study the relationships between different variables. Some commonly used inferential data analysis methods are listed below, followed by a short code sketch:

  • Correlation: To understand the relationship between two or more variables.
  • Cross-tabulation: To analyze the relationship between two or more categorical variables.
  • Regression analysis: To study the impact of independent variables on a dependent variable.
  • Frequency tables: To understand how often different values occur in the data.
  • Analysis of variance (ANOVA): To test whether the means of two or more groups differ significantly.
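
Purely as an illustration, the snippet below runs a few of these inferential methods on simulated data using scipy and statsmodels; the variable names and the simulated relationship are assumptions, not results from any real study.

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "hours_studied": rng.uniform(0, 10, 100),
    "group": rng.choice(["A", "B", "C"], 100),
})
df["exam_score"] = 50 + 4 * df["hours_studied"] + rng.normal(0, 5, 100)

# Correlation: relationship between two variables
r, p = stats.pearsonr(df["hours_studied"], df["exam_score"])

# Cross-tabulation: relationship between two categorical variables
passed = df["exam_score"] > 70
crosstab = pd.crosstab(df["group"], passed)

# Regression analysis: impact of the independent variable on the dependent variable
X = sm.add_constant(df["hours_studied"])
model = sm.OLS(df["exam_score"], X).fit()

# Analysis of variance: do mean scores differ between the three groups?
f_stat, p_anova = stats.f_oneway(
    *[df.loc[df["group"] == g, "exam_score"] for g in ["A", "B", "C"]]
)

print(f"Pearson r = {r:.2f}, regression slope = {model.params['hours_studied']:.2f}")
print(crosstab)
print(f"ANOVA F = {f_stat:.2f}, p = {p_anova:.3f}")
```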

Qualitative research involves an inductive method for data analysis where hypotheses are developed after data collection. The methods include the following; a minimal content-analysis sketch follows the list:

  • Content analysis: For analyzing documented information from text and images by determining the presence of certain words or concepts in texts.
  • Narrative analysis: For analyzing content obtained from sources such as interviews, field observations, and surveys. The stories and opinions shared by people are used to answer research questions.
  • Discourse analysis: For analyzing interactions with people considering the social context, that is, the lifestyle and environment, under which the interaction occurs.
  • Grounded theory: Involves hypothesis creation by data collection and analysis to explain why a phenomenon occurred.
  • Thematic analysis: To identify important themes or patterns in data and use these to address an issue.
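
As a very simple illustration (not from the source), one basic form of content analysis — counting how often keywords associated with a concept appear in interview transcripts — can be sketched as follows; the transcripts, concepts and keywords are made up.

```python
import re
from collections import Counter

# Hypothetical interview excerpts
transcripts = [
    "We worry about food prices and whether our income will cover them.",
    "Access to fresh food is the main issue; prices keep rising.",
    "Our community garden improved access to vegetables.",
]

# Concepts of interest and the keywords that indicate them (illustrative only)
concepts = {
    "affordability": ["price", "prices", "income", "afford"],
    "access": ["access", "availability"],
}

counts = Counter()
for text in transcripts:
    words = re.findall(r"[a-z']+", text.lower())
    for concept, keywords in concepts.items():
        counts[concept] += sum(words.count(k) for k in keywords)

print(counts)  # e.g. Counter({'affordability': 3, 'access': 2})
```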

How to choose a research methodology?

Here are some important factors to consider when choosing a research methodology: 8

  • Research objectives, aims, and questions —these would help structure the research design.
  • Review existing literature to identify any gaps in knowledge.
  • Check the statistical requirements —if data-driven or statistical results are needed, then quantitative research is the best fit. If the research questions can be answered based on people’s opinions and perceptions, then qualitative research is most suitable.
  • Sample size —sample size can often determine the feasibility of a research methodology. For a large sample, less effort- and time-intensive methods are appropriate.
  • Constraints —constraints of time, geography, and resources can help define the appropriate methodology.


How to write a research methodology?

A research methodology should include the following components: 3,9

  • Research design —should be selected based on the research question and the data required. Common research designs include experimental, quasi-experimental, correlational, descriptive, and exploratory.
  • Research method —this can be quantitative, qualitative, or mixed-method.
  • Reason for selecting a specific methodology —explain why this methodology is the most suitable to answer your research problem.
  • Research instruments —explain the research instruments you plan to use, mainly referring to the data collection methods such as interviews, surveys, etc. Here as well, a reason should be mentioned for selecting the particular instrument.
  • Sampling —this involves selecting a representative subset of the population being studied.
  • Data collection —involves gathering data using several data collection methods, such as surveys, interviews, etc.
  • Data analysis —describe the data analysis methods you will use once you’ve collected the data.
  • Research limitations —mention any limitations you foresee while conducting your research.
  • Validity and reliability —validity helps identify the accuracy and truthfulness of the findings; reliability refers to the consistency and stability of the results over time and across different conditions.
  • Ethical considerations —research should be conducted ethically. The considerations include obtaining consent from participants, maintaining confidentiality, and addressing conflicts of interest.

Streamline Your Research Paper Writing Process with Paperpal

The methods section is a critical part of a research paper, as it allows other researchers to understand your findings and replicate your work in their own research. However, it is usually also the most difficult section to write. This is where Paperpal can help you overcome writer’s block and create the first draft in minutes with Paperpal Copilot, its secure generative AI feature suite.

With Paperpal you can get research advice, write and refine your work, rephrase and verify the writing, and ensure submission readiness, all in one place. Here’s how you can use Paperpal to develop the first draft of your methods section.  

  • Generate an outline: Input some details about your research to instantly generate an outline for your methods section 
  • Develop the section: Use the outline and suggested sentence templates to expand your ideas and develop the first draft.  
  • Paraphrase and trim: Get clear, concise academic text with paraphrasing that conveys your work effectively and word reduction to fix redundancies.
  • Choose the right words: Enhance text by choosing contextual synonyms based on how the words have been used in previously published work.
  • Check and verify text: Make sure the generated text showcases your methods correctly, has all the right citations, and is original and authentic.

You can repeat this process to develop each section of your research manuscript, including the title, abstract and keywords. Ready to write your research papers faster, better, and without the stress? Sign up for Paperpal and start writing today!

Frequently Asked Questions

Q1. What are the key components of research methodology?

A1. A good research methodology has the following key components:

  • Research design
  • Data collection procedures
  • Data analysis methods
  • Ethical considerations

Q2. Why is ethical consideration important in research methodology?

A2. Ethical consideration is important in research methodology to assure readers of the reliability and validity of the study. Researchers must clearly mention the ethical norms and standards followed during the conduct of the research and also mention whether the research has been cleared by an institutional review board. The following 10 points are the important principles related to ethical considerations: 10

  • Participants should not be subjected to harm.
  • Respect for the dignity of participants should be prioritized.
  • Full consent should be obtained from participants before the study.
  • Participants’ privacy should be ensured.
  • Confidentiality of the research data should be ensured.
  • Anonymity of individuals and organizations participating in the research should be maintained.
  • The aims and objectives of the research should not be exaggerated.
  • Affiliations, sources of funding, and any possible conflicts of interest should be declared.
  • Communication in relation to the research should be honest and transparent.
  • Misleading information and biased representation of primary data findings should be avoided.

Q3. What is the difference between methodology and method?

A3. Research methodology is different from a research method, although both terms are often confused. Research methods are the tools used to gather data, while the research methodology provides a framework for how research is planned, conducted, and analyzed. The latter guides researchers in making decisions about the most appropriate methods for their research. Research methods refer to the specific techniques, procedures, and tools used by researchers to collect, analyze, and interpret data, for instance surveys, questionnaires, interviews, etc.

Research methodology is, thus, an integral part of a research study. It helps ensure that you stay on track to meet your research objectives and answer your research questions using the most appropriate data collection and analysis tools based on your research design.


  • Research methodologies. Pfeiffer Library website. Accessed August 15, 2023. https://library.tiffin.edu/researchmethodologies/whatareresearchmethodologies
  • Types of research methodology. Eduvoice website. Accessed August 16, 2023. https://eduvoice.in/types-research-methodology/
  • The basics of research methodology: A key to quality research. Voxco. Accessed August 16, 2023. https://www.voxco.com/blog/what-is-research-methodology/
  • Sampling methods: Types with examples. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/types-of-sampling-for-social-research/
  • What is qualitative research? Methods, types, approaches, examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-qualitative-research-methods-types-examples/
  • What is quantitative research? Definition, methods, types, and examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
  • Data analysis in research: Types & methods. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/data-analysis-in-research/#Data_analysis_in_qualitative_research
  • Factors to consider while choosing the right research methodology. PhD Monster website. Accessed August 17, 2023. https://www.phdmonster.com/factors-to-consider-while-choosing-the-right-research-methodology/
  • What is research methodology? Research and writing guides. Accessed August 14, 2023. https://paperpile.com/g/what-is-research-methodology/
  • Ethical considerations. Business research methodology website. Accessed August 17, 2023. https://research-methodology.net/research-methodology/ethical-considerations/




Research design: the methodology for interdisciplinary research framework

1 Biometris, Wageningen University and Research, PO Box 16, 6700 AA Wageningen, The Netherlands

Jarl K. Kampen

2 Statua, Dept. of Epidemiology and Medical Statistics, Antwerp University, Venusstraat 35, 2000 Antwerp, Belgium

Many of today’s global scientific challenges require the joint involvement of researchers from different disciplinary backgrounds (social sciences, environmental sciences, climatology, medicine, etc.). Such interdisciplinary research teams face many challenges resulting from differences in training and scientific culture. Interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical factor skills and competences. For that purpose this paper presents the Methodology for Interdisciplinary Research (MIR) framework. The MIR framework was developed to help cross disciplinary borders, especially those between the natural sciences and the social sciences. The framework has been specifically constructed to facilitate the design of interdisciplinary scientific research, and can be applied in an educational program, as a reference for monitoring the phases of interdisciplinary research, and as a tool to design such research in a process approach. It is suitable for research projects of different sizes and levels of complexity, and it allows for a range of methods’ combinations (case study, mixed methods, etc.). The different phases of designing interdisciplinary research in the MIR framework are described and illustrated by real-life applications in teaching and research. We further discuss the framework’s utility in research design in landscape architecture, mixed methods research, and provide an outlook to the framework’s potential in inclusive interdisciplinary research, and last but not least, research integrity.

Introduction

Current challenges, e.g., energy, water, food security, one world health and urbanization, involve the interaction between humans and their environment. A (mono)disciplinary approach, be it a psychological, economical or technical one, is too limited to capture any one of these challenges. The study of the interaction between humans and their environment requires knowledge, ideas and research methodology from different disciplines (e.g., ecology or chemistry in the natural sciences, psychology or economy in the social sciences). So collaboration between natural and social sciences is called for (Walsh et al. 1975 ).

Over the past decades, different forms of collaboration have been distinguished although the terminology used is diverse and ambiguous. For the present paper, the term interdisciplinary research is used for (Aboelela et al. 2007 , p. 341):

any study or group of studies undertaken by scholars from two or more distinct scientific disciplines. The research is based upon a conceptual model that links or integrates theoretical frameworks from those disciplines, uses study design and methodology that is not limited to any one field, and requires the use of perspectives and skills of the involved disciplines throughout multiple phases of the research process.

Scientific disciplines (e.g., ecology, chemistry, biology, psychology, sociology, economy, philosophy, linguistics, etc.) are categorized into distinct scientific cultures: the natural sciences, the social sciences and the humanities (Kagan 2009 ). Interdisciplinary research may involve different disciplines within a single scientific culture, and it can also cross cultural boundaries as in the study of humans and their environment.

A systematic review of the literature on natural-social science collaboration (Fischer et al. 2011 ) confirmed the general impression of this collaboration to be a challenge. The nearly 100 papers in their analytic set mentioned more instances of barriers than of opportunities (72 and 46, respectively). Four critical factors for success or failure in natural-social science collaboration were identified: the paradigms or epistemologies in the current (mono-disciplinary) sciences, the skills and competences of the scientists involved, the institutional context of the research, and the organization of collaborations (Fischer et al. 2011 ). The so-called “paradigm war” between neopositivist versus constructivists within the social and behavioral sciences (Onwuegbuzie and Leech 2005 ) may complicate pragmatic collaboration further.

It has been argued that interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical factor skills and competences (Frischknecht 2000) and accordingly, some interdisciplinary programs have been developed since (Baker and Little 2006; Spelt et al. 2009). The overall effect of interdisciplinary programs can be expected to be small as most programs are mono-disciplinary and based on a single paradigm (positivist-constructivist, qualitative-quantitative; see e.g., Onwuegbuzie and Leech 2005). In our methodology teaching, consultancy and research practices with heterogeneous groups of students and staff, we saw that most had received mono-disciplinary training; the minority who had received multidisciplinary training had, with few exceptions, been trained within a single paradigm. During our teaching and consultancy for heterogeneous groups of students and staff aimed at designing interdisciplinary research, we built the framework for methodology in interdisciplinary research (MIR). With the MIR framework, we aspire to contribute to the critical factors skills and competences (Fischer et al. 2011) for social and natural sciences collaboration. Note that the scale of interdisciplinary research projects we have in mind may vary from comparably modest ones (e.g., finding a link between noise reducing asphalt and quality of life; Vuye et al. 2016) to very large projects (finding a link between anthropogenic greenhouse gas emissions, climate change, and food security; IPCC 2015).

In the following section of this paper we describe the MIR framework and elaborate on its components. The third section gives two examples of the application of the MIR framework. The paper concludes with a discussion of the MIR framework in the broader contexts of mixed methods research, inclusive research, and other promising strains of research.

The methodology in interdisciplinary research framework

Research as a process in the methodology in interdisciplinary research framework.

The Methodology for Interdisciplinary Research (MIR) framework was built on the process approach (Kumar 1999), because in the process approach the research question or hypothesis drives all decisions in the various stages of research. This helps the MIR framework put the common goal of the researchers at the center, rather than the diversity of their respective backgrounds. The MIR framework also introduces an agenda: the research team needs to carefully think through the different parts of the design of their study before starting its execution (Fig. 1). First, the team discusses the conceptual design of their study, which contains the ‘why’ and ‘what’ of the research. Second, the team discusses the technical design of the study, which contains the ‘how’ of the research. Only after the team agrees that the complete research design is sufficiently crystallized does the execution of the work (including fieldwork) start.

Fig. 1 The Methodology of Interdisciplinary Research framework

Whereas the conceptual and technical designs are by definition interdisciplinary team work, the respective team members may do their (mono)disciplinary parts of fieldwork and data analysis on a modular basis (see Bruns et al. 2017: p. 21). Finally, when all evidence is collected, an interdisciplinary synthesis of the analyses follows, whose conclusions are input for the final report. This implies that the MIR framework allows for a range of scales of research projects, e.g., a mixed methods project and its smaller qualitative and quantitative modules, or a multi-national sustainability project and its national sociological, economic and ecological modules.

The conceptual design

Interdisciplinary research design starts with the “conceptual design” which addresses the ‘why’ and ‘what’ of a research project at a conceptual level to ascertain the common goals pivotal to interdisciplinary collaboration (Fischer et al. 2011). The conceptual design mostly includes activities such as thinking, exchanging interdisciplinary knowledge, reading and discussing. The product of the conceptual design is called the “conceptual framework”, which comprises the research objective (what is to be achieved by the research), the theory or theories that are central in the research project, the research questions (what knowledge is to be produced), and the (partial) operationalization of constructs and concepts that will be measured or recorded during execution. While the members of the interdisciplinary team and the commissioner of the research must reach a consensus about the research objective, the ‘why’, the focus in research design must be the production of the knowledge required to achieve that objective, the ‘what’.

With respect to the ‘why’ of a research project, an interdisciplinary team typically starts with a general aim as requested by the commissioner or funding agency, and a set of theories to formulate a research objective. This role of theory is not always obvious to students from the natural sciences, who tend to think in terms of ‘models’ with directly observable variables. On the other hand, students from the social sciences tend to think in theories with little attention to observable variables. In the MIR framework, models as simplified descriptions or explanations of what is studied in the natural sciences play the same role in informing research design, raising research questions, and informing how a concept is understood, as do theories in social science.

Research questions concern concepts, i.e. general notions or ideas based on theory or common sense that are multifaceted and not directly visible or measurable. For example, neither food security (with its many different facets) nor a person’s attitude towards food storage may be directly observed. The operationalization of concepts, the transformation of concepts into observable indicators, in interdisciplinary research requires multiple steps, each informed by theory. For instance, in line with particular theoretical frameworks, sustainability and food security may be seen as the composite of a social, an economic and an ecological dimension (e.g., Godfray et al. 2010 ).

As the concept of interest is multi-disciplinary and multi-dimensional, the interdisciplinary team will need to read, discuss and decide on how these dimensions and their indicators are weighted to measure the composite interdisciplinary concept to get the required interdisciplinary measurements. The resulting measure or measures for the interdisciplinary concept may be of the nominal, ordinal, interval and ratio level, or a combination thereof. This operationalization procedure is known as the port-folio approach to widely defined measurements (Tobi 2014 ). Only after the research team has finalized the operationalization of the concepts under study, the research questions and hypotheses can be made operational. For example, a module with descriptive research questions may now be turned into an operational one like, what are the means and variances of X1, X2, and X3 in a given population? A causal research question may take on the form, is X (a composite of X1, X2 and X3) a plausible cause for the presence or absence of Y? A typical qualitative module could study, how do people talk about X1, X2 and X3 in their everyday lives?
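
Purely as an illustration of the weighted-composite idea described above (not code from the paper), the sketch below computes a composite score from three dimension scores once the team has agreed on weights; the dimension names, values and weights are hypothetical.

```python
import pandas as pd

# Hypothetical indicator scores (0-100) for three dimensions of a composite
# concept such as food security, measured for a handful of households
indicators = pd.DataFrame({
    "social":     [62, 80, 45, 70],
    "economic":   [55, 90, 40, 65],
    "ecological": [70, 60, 50, 80],
}, index=["hh1", "hh2", "hh3", "hh4"])

# Weights agreed on by the interdisciplinary team (must sum to 1)
weights = {"social": 0.4, "economic": 0.35, "ecological": 0.25}

# Composite score: weighted sum of the dimension scores
composite = sum(indicators[dim] * w for dim, w in weights.items())
print(composite.round(1))
```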

The technical design

Members of an interdisciplinary team usually have had different training with respect to research methods, which makes discussing and deciding on the technical design more challenging but also potentially more creative than in a mono-disciplinary team. The technical design addresses the issues ‘how, where and when will research units be studied’ (study design), ‘how will measurement proceed’ (instrument selection or design), ‘how and how many research units will be recruited’ (sampling plan), and ‘how will collected data be analyzed and synthesized’ (analysis plan). The MIR framework provides the team a set of topics and their relationships to one another and to generally accepted quality criteria (see Fig.  1 ), which helps in designing this part of the project.

Interdisciplinary teams need to be pragmatic, as the research questions agreed on should drive decisions on the data collection set-up (e.g., a cross-sectional study of inhabitants of a region, a laboratory experiment, a cohort study, a case control study, etc.), the so-called “study design” (e.g., Kumar 2014; De Vaus 2001; Adler and Clark 2011; Tobi and van den Brink 2017), instead of traditional ‘pet’ approaches. The typical study design for descriptive research questions and research questions on associations is the cross-sectional study design. Longitudinal study designs are required to investigate development over time, and cause-effect relationships are ideally studied in experiments (e.g., Kumar 2014; Shipley 2016). Phenomenological questions concern a phenomenon about which little is known and which has to be studied in the environment where it takes place, which calls for a case study design (e.g., Adler and Clark 2011: p. 178). For each module, the study design is to be further explicated by the number of data collection waves, the level of control by the researcher and its reference period (e.g., Kumar 2014) to ensure the team’s common understanding.

Then, decisions about the way data is to be collected, e.g., by means of certified instruments, observation, interviews, questionnaires, queries on existing data bases, or a combination of these are to be made. It is especially important to discuss the role of the observer (researcher) as this is often a source of misunderstanding in interdisciplinary teams. In the sciences, the observer is usually considered a neutral outsider when reading a standardized measurement instrument (e.g., a pyranometer to measure incoming solar radiation). In contrast, in the social sciences, the observer may be (part of) the measurement instrument, for example in participant observation or when doing in-depth interviews. After all, in participant observation the researcher observes from a member’s perspective and influences what is observed owing to the researcher’s participation (Flick 2006 : p. 220). Similarly in interviews, by which we mean “a conversation that has a structure and a purpose determined by the one party—the interviewer” (Kvale 2007 : p. 7), the interviewer and the interviewee are part of the measurement instrument (Kvale and Brinkmann 2009 : p. 2). In on-line and mail questionnaires the interviewer is eliminated as part of the instrument by standardizing the questions and answer options. Queries on existing data bases refer to the use of secondary data or secondary analysis. Different disciplines tend to use different bibliographic data bases (e.g., CAB Abstracts, ABI/INFORM or ERIC) and different data repositories (e.g., the European Social Survey at europeansocialsurvey.org or the International Council for Science data repository hosted by www.pangaea.de ).

Depending on whether or not the available, existing, measurement instruments tally with the interdisciplinary operationalisations from the conceptual design, the research team may or may not need to design instruments. Note that in some cases the social scientists’ instinct may be to rely on a questionnaire whereas the collaboration with another discipline may result in more objective possibilities (e.g., compare asking people about what they do with surplus medication, versus measuring chemical components from their input into the sewer system). Instrument design may take on different forms, such as the design of a device (e.g., pyranometer), a questionnaire (Dillman 2007 ) or a part thereof (e.g., a scale see DeVellis 2012 ; Danner et al. 2016 ), an interview guide with topics or questions for the interviewees, or a data extraction form in the context of secondary analysis and literature review (e.g., the Cochrane Collaboration aiming at health and medical sciences or the Campbell Collaboration aiming at evidence based policies).

Researchers from different disciplines are inclined to think of different research objects (e.g., animals, humans or plots), which is where the (specific) research questions come in as these identify the (possibly different) research objects unambiguously. In general, research questions that aim at making an inventory, whether it is an inventory of biodiversity or of lodging, call for a random sampling design. Both in the biodiversity and lodging example, one may opt for random sampling of geographic areas by means of a list of coordinates. Studies that aim to explain a particular phenomenon in a particular context would call for a purposive sampling design (non-random selection). Because studies of biodiversity and housing obey the same laws in terms of appropriate sampling design for similar research questions, individual students and researchers are sensitized to commonalities of their respective (mono)disciplines. For example, a research team interested in the effects of landslides on a socio-ecological system may select for their study one village that suffered from landslides and one village that did not suffer from landslides that have other characteristics in common (e.g., kind of soil, land use, land property legislation, family structure, income distribution, et cetera).

The data analysis plan describes how data will be analysed, for each of the separate modules and for the project at large. In the context of a multi-disciplinary quantitative research project, the data analysis plan will list the intended uni-, bi- and multivariate analyses such as measures for distributions (e.g., means and variances), measures for association (e.g., Pearson Chi square or Kendall Tau) and data reduction and modelling techniques (e.g., factor analysis and multiple linear regression or structural equation modelling) for each of the research modules using the data collected. When applicable, it will describe interim analyses and follow-up rules. In addition to the plans at modular level, the data analysis plan must describe how the input from the separate modules, i.e. different analyses, will be synthesized to answer the overall research question. In case of mixed methods research, the particular type of mixed methods design chosen describes how, when, and to what extent the team will synthesize the results from the different modules.
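
As an illustration only (not part of the paper), a quantitative module’s data analysis plan can be written down explicitly before fieldwork starts, alongside the analyses it commits to. The scipy functions below are standard; the module labels, variables and data are hypothetical.

```python
from scipy import stats

# Hypothetical data analysis plan for one quantitative module,
# agreed on by the team before any fieldwork starts
analysis_plan = {
    "RQ1 (distribution of X1-X3)": "means, variances, frequency tables",
    "RQ2 (association of X1 with Y)": "Kendall tau (ordinal data)",
    "RQ3 (group differences)": "Pearson chi-square on a cross-tabulation",
    "Synthesis": "feed module results into the joint interdisciplinary synthesis",
}

# Once data are collected, the corresponding analyses might look like this:
x1 = [1, 2, 2, 3, 4, 4, 5]           # hypothetical ordinal scores
y  = [2, 1, 3, 3, 5, 4, 5]
tau, p_tau = stats.kendalltau(x1, y)

observed = [[20, 15], [10, 25]]       # hypothetical 2x2 cross-tabulation
chi2, p_chi2, dof, expected = stats.chi2_contingency(observed)

print(f"Kendall tau = {tau:.2f} (p = {p_tau:.3f})")
print(f"Chi-square = {chi2:.2f} (p = {p_chi2:.3f})")
```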

Unfortunately, in our experience, when some of the research modules rely on a qualitative approach, teams tend to refrain from designing a data analysis plan before starting the field work. While the absence of a data analysis plan may be regarded as acceptable in fields that rely exclusively on qualitative research (e.g., ethnography), failure to communicate how data will be analysed and what potential evidence will be produced deals a deathblow to interdisciplinarity. For many researchers not familiar with qualitative research, the black box presented as “qualitative data analysis” is a big hurdle, and a transparent and systematic plan is a sine qua non for any scientific collaboration. The absence of a data analysis plan for all modules results in an absence of synthesis of perspectives and skills of the disciplines involved, and in separate (disciplinary) research papers or separate chapters in the research report without an answer to the overall research question. So, although researchers may find it hard to write the data analysis plan for qualitative data, it is pivotal in interdisciplinary research teams.

Similar to the quantitative data analysis plan, the qualitative data analysis plan presents the description of how the researcher will get acquainted with the data collected (e.g., by constructing a narrative summary per interviewee or a paired-comparison of essays). Additionally, the rules to decide on data saturation need be presented. Finally, the types of qualitative analyses are to be described in the data analysis plan. Because there is little or no standardized terminology in qualitative data analysis, it is important to include a precise description as well as references to the works that describe the method intended (e.g., domain analysis as described by Spradley 1979 ; or grounded theory by means of constant-comparison as described by Boeije 2009 ).

Integration

To benefit optimally from the research being interdisciplinary the modules need to be brought together in the integration stage. The modules may be mono- or interdisciplinary and may rely on quantitative, qualitative or mixed methods approaches. So the MIR framework fits the view that distinguishes three multimethods approaches (quali–quali, quanti–quanti, and quali–quant).

Although the MIR framework has not been designed with the intention to promote mixed methods research, it is suitable for the design of mixed methods research as the kind of research that calls for both quantitative and qualitative components (Creswell and Piano Clark 2011 ). Indeed, just like the pioneers in mixed methods research (Creswell and Piano Clark 2011 : p. 2), the MIR framework deconstructs the package deals of paradigm and data to be collected. The synthesis of the different mono or interdisciplinary modules may benefit from research done on “the unique challenges and possibilities of integration of qualitative and quantitative approaches” (Fetters and Molina-Azorin 2017 : p. 5). We distinguish (sub) sets of modules being designed as convergent, sequential or embedded (adapted from mixed methods design e.g., Creswell and Piano Clark 2011 : pp. 69–70). Convergent modules, whether mono or interdisciplinary, may be done parallel and are integrated after completion. Sequential modules are done after one another and the first modules inform the latter ones (this includes transformative and multiphase mixed methods design). Embedded modules are intertwined. Here, modules depend on one another for data collection and analysis, and synthesis may be planned both during and after completion of the embedded modules.

Scientific quality and ethical considerations in the design of interdisciplinary research

A minimum set of jargon related to the assessment of scientific quality of research (e.g., triangulation, validity, reliability, saturation, etc.) can be found scattered in Fig.  1 . Some terms are reserved by particular paradigms, others may be seen in several paradigms with more or less subtle differences in meaning. In the latter case, it is important that team members are prepared to explain and share ownership of the term and respect the different meanings. By paying explicit attention to the quality concepts, researchers from different disciplines learn to appreciate each other’s concerns for good quality research and recognize commonalities. For example, the team may discuss measurement validity of both a standardized quantitative instrument and that of an interview and discover that the calibration of the machine serves a similar purpose as the confirmation of the guarantee of anonymity at the start of an interview.

Throughout the process of research design, ethics require explicit discussion among all stakeholders in the project. Ethical issues run through all components in the MIR framework in Fig.  1 . Where social and medical scientists may be more sensitive to ethical issues related to humans (e.g., the 1979 Belmont Report criteria of beneficence, justice, and respect), others may be more sensitive to issues related to animal welfare, ecology, legislation, the funding agency (e.g., implications for policy), data and information sharing (e.g., open access publishing), sloppy research practices, or long term consequences of the research. This is why ethics are an issue for the entire interdisciplinary team and cannot be discussed on project module level only.

The MIR framework in practice: two examples

Teaching research methodology to heterogeneous groups of students

Institutional context and background of the MIR framework

Wageningen University and Research (WUR) advocates in its teaching and research an interdisciplinary approach to the study of global issues related to the motto “To explore the potential of nature to improve the quality of life.” Wageningen University’s student population is multidisciplinary and international (e.g., Tobi and Kampen 2013 ). Traditionally, this challenge of diversity in one classroom is met by covering a width of methodological topics and examples from different disciplines. However, when students of various programmes received methodological education in mixed classes, students of some disciplines would regard with disinterest or even disdain methods and techniques of the other disciplines. Different disciplines, especially from the qualitative respectively quantitative tradition in the social sciences (Onwuegbuzie and Leech 2005 : p. 273), claim certain study designs, methods of data collection and analysis as their territory, a claim reflected in many textbooks. We found that students from a qualitative tradition would not be interested, and would not even study, content like the design of experiments and quantitative data collection; and students from a quantitative tradition would ignore case study design and qualitative data collection. These students assumed they didn’t need any knowledge about ‘the other tradition’ for their future careers, despite the call for interdisciplinarity.

To enhance interdisciplinarity, WUR provides an MSc course mandatory for most students, in which multi-disciplinary teams do research for a commissioner. Students reported difficulties similar to the ones found in the literature: miscommunication due to talking different scientific languages and feelings of distrust and disrespect due to prejudice. This suggested that research methodology courses ought to help prepare for interdisciplinary collaboration by introducing a single methodological framework that 1) creates sensitivity to the benefits and challenges of interdisciplinary research by means of a common vocabulary and fosters respect for other disciplines, 2) starts from the research questions as pivotal in decision making on research methods instead of tradition or ontology, and 3) allows available methodologies and methods to be potentially applicable to any scientific research problem.

Teaching with MIR—the conceptual framework

As a first step, we replaced textbooks with ones that reject the idea that any scientific tradition has exclusive ownership of any methodological approach or method. The MIR framework further guides our methodology teaching in two ways. First, it presents a logical sequence of topics (first conceptual design, then technical design; first research question(s) or hypotheses, then study design; etc.). Second, it allows for a conceptual separation of topics (e.g., study design from instrument design). Educational programmes at Wageningen University and Research consistently stress the vital importance of good research design. In fact, 50% of the mark in most BSc and MSc courses in research methodology is based on the assessment of a research proposal that students design in small (2-4 students) and heterogeneous (discipline, gender and nationality) groups. The research proposal must describe a project which can be executed in practice, and whose limitations (measurement, internal, and external validity) are carefully discussed.

Groups start by selecting a general research topic. They discuss together previously attained courses from a range of programs to identify personal and group interests, with the aim to reach an initial research objective and a general research question as input for the conceptual design. Often, their initial research objective and research question are too broad to be researchable (e.g., Kumar 2014 : p. 64; Adler and Clark 2011 : p. 71). In plenary sessions, the (basics of) critical assessment of empirical research papers is taught with special attention to the ‘what’ and ‘why’ section of research papers. During tutorials students generate research questions until the group agrees on a research objective, with one general research question that consists of a small set of specific research questions. Each of the specific research questions may stem from a different discipline, whereas answering the general research question requires integrating the answers to all specific research questions.

The group then identifies the key concepts in their research questions, while exchanging thoughts on possible attributes based on what they have learnt from previous courses (theories) and literature. When doing so they may judge the research question as too broad, in which case they will turn to the question strategies toolbox again. Once they agree on the formulation of the research questions and the choice of concepts, tasks are divided. In general, each student turns to the literature he/she is most familiar with or interested in, for the operationalization of the concept into measurable attributes and writes a paragraph or two about it. In the next meeting, the groups read and discuss the input and decide on the set-up and division of tasks with respect to the technical design.

Teaching with MIR—the technical framework

The technical part of research design distinguishes between study design, instrument design, sampling design, and the data analysis plan. In class, we first present students with a range of study designs (cross sectional, experimental, etc.). Student groups select an appropriate study design by comparing the demands made by the research questions with criteria for internal validity. When a (specific) research question calls for a study design that is not seen as practically feasible or ethically possible, they will rephrase the research question until the demands of the research question tally with the characteristics of at least one ethical, feasible and internally valid study design.

While following plenary sessions in which different random and non-random sampling or selection strategies are taught, the groups start working on their sampling design. They make two decisions informed by their research question: the population(s) of research units, and the requirements of the sampling strategy for each population. Like many other aspects of research design, this can be an iterative process. For example, suppose the research question mentioned "local policy makers", which is too vague for a sampling design. The decision may then be to limit the study to "policy makers at the municipality level in the Netherlands" and to adapt the general and the specific research questions accordingly. Next, the group identifies whether the sampling design needs to focus on diversity (e.g., when the objective is to make an inventory of possible local policies), on representativeness (e.g., when the objective is to estimate the prevalence of types of local policies), or on people with particular information (e.g., when the objective is to study people with experience of a given local policy). When a sample has to be representative, the students must produce an assessment of external validity, whereas when the aim is to map diversity they must discuss possible ways of source triangulation. Finally, in conjunction with the data analysis plan, the students decide on the sample size and/or the saturation criteria.
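
When representativeness is the aim, sample-size reasoning of the following kind often comes into play. The sketch below uses the standard normal-approximation formula for estimating a proportion; the expected prevalence, margin of error, and confidence level are hypothetical numbers, not values prescribed by the framework.

```python
import math
from statistics import NormalDist

def sample_size_for_prevalence(expected_p: float, margin_of_error: float, confidence: float = 0.95) -> int:
    """Classical normal-approximation sample size for estimating a proportion."""
    # z-value for a two-sided confidence interval (about 1.96 for 95%)
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    n = (z ** 2) * expected_p * (1 - expected_p) / margin_of_error ** 2
    return math.ceil(n)

# Hypothetical example: estimate the prevalence of a type of local policy
# among municipalities to within +/- 5 percentage points at 95% confidence.
print(sample_size_for_prevalence(expected_p=0.5, margin_of_error=0.05))  # 385
```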

When the group has agreed on their population(s) and on the strategy for recruiting research units, the next step is to finalize the technical aspects of operationalisation, i.e., to address exactly how information will be extracted from the research units. Depending on what is practically feasible in terms of measurement, the data collection instrument chosen may be a standardised one (e.g., a spectrograph, a questionnaire) or a less standardised one (e.g., semi-structured interviews, visual inspection). The students have to discuss the possibilities of method triangulation and explain the possible weaknesses of their data collection plan in terms of measurement validity and reliability.
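
When the instrument is a multi-item questionnaire, reliability is often checked with an internal-consistency coefficient such as Cronbach's alpha. The snippet below is a minimal sketch on invented item scores, offered only as an illustration of such a check, not as a required MIR step.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented scores for 6 respondents on a 4-item, 5-point scale.
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [5, 4, 5, 5],
    [1, 2, 1, 2],
    [4, 4, 3, 4],
])
print(round(cronbach_alpha(scores), 2))
```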

Recent developments

At present, little attention is paid to the data analysis plan and to procedures for synthesis and reporting, because the programmes differ in the data analysis courses they offer and because execution of the research is not part of the BSc and MSc methodology courses. Recently, we designed a course for an interdisciplinary BSc programme in which the research question is made central to learning about, and deciding on, statistical and qualitative data analysis. Nonetheless, over the past years the number of methodology courses for graduate students that support the MIR framework has been expanded, e.g., a course "From Topic to Proposal"; separate training modules on questionnaire construction, interviewing, and observation; and optional courses on quantitative and qualitative data analysis. These courses are open to (and attended by) PhD students regardless of their programme. In Flanders (Belgium), the Flemish Training Network for Statistics and Methodology (FLAMES) has for the last four years successfully applied the approach outlined in Fig. 1 in its courses on research design and data collection methods. The division of the research process into a conceptual design, technical design, operationalisation, analysis plan, and sampling plan has proved appealing to students from disciplines ranging from linguistics to bioengineering.

Researching with MIR: noise reducing asphalt layers and quality of life

Research objective and research question

This example of the application of the MIR framework comes from a study of the effects of "noise reducing asphalt layers" on quality of life (Vuye et al. 2016), a project commissioned by the City of Antwerp in 2015 and executed by a multidisciplinary research team of the University of Antwerp (Belgium). The principal researcher was an engineer from the Faculty of Applied Engineering (Department of Construction), supported by two researchers from the Faculty of Medicine and Health Sciences (Department of Epidemiology and Social Statistics), one with a background in qualitative and one with a background in quantitative research methods. A number of meetings were held in which the research team and the commissioners discussed the research objective (the 'what' and 'why'). The research objective was in part dictated by the European Noise Directive 2002/49/EC, which requires all EU member states to draft noise action plans; the challenge in this study was to produce evidence of a link between the acoustic and mechanical properties of different types of asphalt and the quality of life of people living in the vicinity of the treated roads. While literature was available on the effects of road surface on sound, and other studies had examined the link between noise and health, no study was found that produced evidence simultaneously on road noise levels and quality of life. The team therefore decided to make the hypothesis that traffic noise reduction has a beneficial effect on people's quality of life central to the research. The general research question was: "To what extent does the placing of noise reducing asphalt layers increase the quality of life of the residents?"

Study design

To test the effect of the asphalt types, a pretest–posttest experiment was initially designed, which was then expanded with several additional experimental (change of road surface) and control (no change of road surface) groups. The research team gradually became aware that quality of life may not be instantly affected by lower noise levels and that a time lag may be involved. A second posttest was therefore planned to follow up on this delayed effect, although it could only be implemented at a selection of the experimental sites.
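
The logic of a pretest–posttest design with control groups can be illustrated with a simple difference-in-differences comparison of group means, as sketched below. The scores are invented for illustration and do not come from the Antwerp study.

```python
import pandas as pd

# Invented mean nuisance scores (0-10, higher = more nuisance) per group and phase.
data = pd.DataFrame({
    "group": ["experimental", "experimental", "control", "control"],
    "phase": ["pretest", "posttest", "pretest", "posttest"],
    "mean_nuisance": [6.8, 5.1, 6.5, 6.4],
})

means = data.pivot(index="group", columns="phase", values="mean_nuisance")
change = means["posttest"] - means["pretest"]

# Difference-in-differences: the change in the experimental group minus the
# change in the control group, which nets out trends common to both groups.
did = change["experimental"] - change["control"]
print(change)
print(f"difference-in-differences estimate: {did:.1f}")
```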

Instrument selection and design

Sound pressure levels were measured with an ISO-standardized procedure called the Statistical Pass-By (SPB) method; a detailed description is given in Vuye et al. (2016). No such objective procedure is available for measuring quality of life, which can only be assessed through self-reports of the residents. Some time was needed for the research team to accept that measuring a multidimensional concept like quality of life is more complicated than just having people rate their "quality of life" on a 10-point scale. For instance, questions had to be phrased in a way that did not give away the purpose of the research (Hawthorne effect), leading to the inclusion of questions about more nuisances than traffic noise alone. This resulted in a self-administered questionnaire in which questions from the Flanders Survey on the Living Environment (Departement Leefmilieu, Natuur & Energie 2013) were supplemented with new questions. Among other things, the questionnaire probed for experienced noise nuisance, quality of sleep, effort needed to concentrate, effort needed to have a conversation inside or outside the home, physical complaints such as headaches, etc.

Sampling design

The selected sites needed to accommodate both types of measurement: that of traffic noise and that of the residents' quality of life. This was a complicating factor that required several rounds of deliberation. While countrywide only certain roads were available for changing the road surface, these roads also had to be mutually comparable in terms of the composition of the population, the type of residential area (e.g., reports from the top floor of a tall apartment building cannot be compared with those at ground level), the average volume of traffic, the vicinity of hospitals, railroads and airports, etc. At the level of roads, therefore, targeted sampling was applied, whereas at the level of residents the aim was to realize a census of all households within a given perimeter of the treated road surfaces. Considerations about the reliability of the instruments guided decisions with respect to sampling. While the measurements of the SPB method were sufficiently reliable to allow for relatively few measurements, the questionnaire suffered from considerable nonresponse, which hampered statistical power. It was therefore decided to increase the power of the study by adding control groups in areas where the road surface was not replaced. In this way, detecting an effect of the intervention did not depend solely on the turnout of the pretest and the posttest.
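
To illustrate the kind of power reasoning involved, the sketch below uses statsmodels to compute the sample size needed per group for a two-sample comparison. The effect size, significance level, power, and response rate are hypothetical planning values, not figures from the study.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical planning numbers: a small-to-medium standardised effect of the
# new road surface on reported nuisance, 5% significance, 80% power.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.3, alpha=0.05, power=0.80, alternative="two-sided")
print(f"respondents needed per group: {n_per_group:.0f}")

# With heavy nonresponse the realised sample may fall well short of this,
# which is one reason to add extra control groups rather than rely solely
# on pre/post turnout in the treated streets.
expected_response_rate = 0.35   # assumed, for illustration only
invitations_needed = n_per_group / expected_response_rate
print(f"invitations needed per group at 35% response: {invitations_needed:.0f}")
```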

Data analysis plan

The statistical analysis had to account for the fact that data were collected at two different levels: the level of the residents filling out the questionnaires, and the level of the roads whose surface was changed. Because survey participation was confidential, results of the pretest and posttest could only be compared at the aggregate (street) level. The analysis also had to control for confounding variables (e.g., sample composition, variation in traffic volume), experimental factors (the various experimental conditions and controls), and non-normal dependent variables. A statistical model appropriate for the analysis of such data is a Generalised Linear Mixed Model.
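
As a rough illustration of this multilevel structure (respondents nested within streets, with a treatment-by-phase effect of interest), the sketch below fits a linear mixed model with a random street intercept to simulated data using statsmodels. All variable names and numbers are invented, and the team's actual Generalised Linear Mixed Model for non-normal outcomes would require a dedicated GLMM implementation (e.g., lme4::glmer in R).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a toy dataset: respondents nested within streets, measured at
# pretest and posttest, with half the streets receiving the new asphalt.
rng = np.random.default_rng(42)
rows = []
for street in range(20):
    treated = street < 10                      # intervention streets
    street_effect = rng.normal(0, 0.8)         # random street-level intercept
    for phase in (0, 1):                       # 0 = pretest, 1 = posttest
        for _ in range(15):                    # respondents per street and phase
            nuisance = 6 + street_effect - 1.2 * treated * phase + rng.normal(0, 1.5)
            rows.append({"street": street, "treated": int(treated), "phase": phase, "nuisance": nuisance})
df = pd.DataFrame(rows)

# Fixed effects for phase, treatment, and their interaction (the intervention
# effect); random intercept per street to capture street-level clustering.
model = smf.mixedlm("nuisance ~ phase * treated", data=df, groups=df["street"])
result = model.fit()
print(result.summary())
```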

Data were collected over the course of 2015, 2016, and 2017 and were awaiting final analysis in spring 2017. Intermediate analyses have resulted in several MSc theses, conference presentations, and working papers reporting on parts of the research.

Discussion and conclusion

In this paper we presented the Methodology for Interdisciplinary Research (MIR) framework that we developed over the past decade, building on our experience as lecturers, consultants, and researchers. The MIR framework recognizes research methodology and methods as important content within the critical factor of skills and competences. It approaches research and collaboration as a process that needs to be designed with the sole purpose of answering the general research question. For the conceptual design, the team members have to discuss and agree on the objective of their communal efforts without squeezing it into a single discipline and thus ignoring complexity. The specific research questions, once formulated, contribute to (self-)respect in the collaboration, as they represent, and stand witness to, the need for interdisciplinarity. In the technical design, different parts are distinguished to stimulate researchers to think and design research outside their respective disciplinary boxes and to consider, for example, an experimental design with qualitative data collection, or a case study design based on quantitative information.

In our teaching and consultancy, we first developed the MIR framework for interdisciplinarity across the social sciences, economics, and the health and environmental sciences. It was then challenged to include research in the design discipline of landscape architecture. What characterizes research in landscape architecture and other design disciplines is that the design product as well as the design process may be the object of study. Lenzholder et al. (2017) therefore distinguish three kinds of research in landscape architecture. The first kind, "research into design", studies the design product post hoc, and the MIR framework suits the interdisciplinary study of such a product. In contrast, "research for design" generates knowledge that feeds into both the noun and the verb 'design', which means it precedes the design(ing). The third kind, "research through design(ing)", employs designing as a research method. At first, just like Deming and Swaffield (2011), we were somewhat skeptical about "designing" as a research method. Lenzholder et al. (2017) argue, however, that the meaning of research through design has evolved through (neo)positivist, constructivist, and transformative paradigms to include a pragmatic stance that resembles the one assumed in the MIR framework. We learned that, because landscape architecture is such an interdisciplinary field, the process approach and the distinction between a conceptual and a technical research design were considered very helpful and were embraced by researchers in landscape architecture (Tobi and van den Brink 2017).

Mixed methods research (MMR) has been used to study topics as diverse as education (e.g., Powell et al. 2008), environmental management (e.g., Molina-Azorin and Lopez-Gamero 2016), health psychology (e.g., Bishop 2015), and information systems (e.g., Venkatesh et al. 2013). Nonetheless, the MIR framework is the first to put MMR in the context of integrating disciplines beyond social inquiry (Greene 2008). Splitting the research into modules stimulates the identification and recognition of the contribution of distinct as well as collaborating disciplines, irrespective of whether they contribute qualitative and/or quantitative research to the interdisciplinary research design. As mentioned in Sect. 2.4, the integration of the different research modules in one interdisciplinary project design may follow one of the mixed methods designs. For example, we have witnessed on several occasions the integration of the social and health sciences in interdisciplinary teams opting for sequential modules in a sequential exploratory mixed methods fashion (e.g., Adamson 2005: 234). In sustainability science research, we have seen the design of concurrent modules for a concurrent nested mixed methods strategy (ibid.) in research integrating the social and natural sciences and economics.

The limitations of the MIR framework are those of any kind of collaboration: it cannot work wonders in the absence of awareness of its necessity, and it requires the willingness to work, learn, and do research together. We developed the MIR framework in and alongside our own teaching, consultancy, and research; it has not been formally evaluated or experimentally compared with teaching, consultancy, and research based on, for example, the regulative cycle for problem solving (van Strien 1986) or the wheel of science (Babbie 2013). In fact, although we wrote "developed" in the previous sentence, we are fully aware of the need to further develop and refine the framework as it stands.

The importance of the MIR framework lies in the complex, multifaceted nature of issues like sustainability, food security, and one world health. For progress in the study of these pressing issues, the understanding, construction, and quality of interdisciplinary portfolio measurements (Tobi 2014) are pivotal and require further study, as do procedures facilitating integration across different disciplines.

Another important strand of further research relates to the continuum of Responsible Conduct of Research (RCR), Questionable Research Practices (QRP), and deliberate misconduct (Steneck 2006). QRPs include failing to report all of a study's conditions, stopping data collection earlier than planned because one has found the result one was looking for, etc. (e.g., John et al. 2012; Simmons et al. 2011; Kampen and Tamás 2014). A meta-analysis of self-reports obtained through surveys revealed that about 2% of researchers admitted to research misconduct at least once, whereas up to 33% admitted to QRPs (Fanelli 2009). While the frequency of QRPs may easily eclipse that of deliberate fraud (John et al. 2012), these practices have received less attention than deliberate misconduct. Claimed research findings may often be accurate measures of the prevailing biases and methodological rigor in a research field (Fanelli and Ioannidis 2013; Fanelli 2010). If research misconduct and QRPs are to be understood, then the disciplinary context must be grasped as a locus of both legitimate and illegitimate activity (Fox 1990). It would be valuable to investigate how working in interdisciplinary teams, and the consequent exposure to other standards of QRP and RCR, influences research integrity understood as appropriate research behaviour from the perspective of different professional standards (Steneck 2006: p. 56). These differences between scientific cultures concern criteria for quality in the design and execution of research, reporting (e.g., criteria for authorship of a paper, preferred publication outlets, citation practices), the archiving and sharing of data, and so on.

Other strands of research include interdisciplinary collaboration and negotiation, where we expect contributions from the "science of team science" (Falk-Krzesinski et al. 2010), and the compatibility of the MIR framework with new research paradigms such as "inclusive research" (a mode of research involving people with intellectual disabilities as more than just objects of research; e.g., Walmsley and Johnson 2003). Because of the complexity and novelty of inclusive health research, a consensus statement was developed on how to conduct health research inclusively (Frankena et al., under review). The eight attributes of inclusive health research identified there may also be taken as guiding attributes in the design of inclusive research according to the MIR framework. For a start, there is room for inclusiveness in the conceptual design, particularly in determining research objectives and in discussing possible theoretical frameworks with team members with an intellectual disability, which Frankena et al. labelled the "designing the study" attribute. There are also opportunities for inclusiveness in the technical design and in execution. For example, the inclusiveness attribute "generating data" overlaps with operationalization and the design/selection of measurement instruments, and the attribute "analyzing data" aligns with the data analysis plan in the technical design.

On a final note, we hope to have aroused the reader's interest in, and to have demonstrated the need for, a methodology for interdisciplinary research design. We further hope that the MIR framework proposed and explained in this article helps those involved in designing an interdisciplinary research project to get a clearer view of the various processes that must be secured during the project's design and execution. We look forward to further collaboration with scientists from all cultures to improve the MIR framework and to make interdisciplinary collaborations successful.

Acknowledgements

The MIR framework is the result of many discussions with students, researchers and colleagues, with special thanks to Peter Tamás, Jennifer Barrett, Loes Maas, Giel Dik, Ruud Zaalberg, Jurian Meijering, Vanessa Torres van Grinsven, Matthijs Brink, Gerda Casimir, and, last but not least, Jenneken Naaldenberg.

References

  • Aboelela SW, Larson E, Bakken S, Carrasquillo O, Formicola A, Glied SA, Gebbie KM. Defining interdisciplinary research: conclusions from a critical review of the literature. Health Serv. Res. 2007;42(1):329–346. doi: 10.1111/j.1475-6773.2006.00621.x.
  • Adamson J. Combined qualitative and quantitative designs. In: Bowling A, Ebrahim S, editors. Handbook of Health Research Methods: Investigation, Measurement and Analysis. Maidenhead: Open University Press; 2005. pp. 230–245.
  • Adler ES, Clark R. An Invitation to Social Research: How It's Done. 4th ed. London: Sage; 2011.
  • Babbie ER. The Practice of Social Research. 13th ed. Belmont, CA: Wadsworth Cengage Learning; 2013.
  • Baker GH, Little RG. Enhancing homeland security: development of a course on critical infrastructure systems. J. Homel. Secur. Emerg. Manag. 2006.
  • Bishop FL. Using mixed methods research designs in health psychology: an illustrated discussion from a pragmatist perspective. Br. J. Health Psychol. 2015;20(1):5–20. doi: 10.1111/bjhp.12122.
  • Boeije HR. Analysis in Qualitative Research. London: Sage; 2009.
  • Bruns D, van den Brink A, Tobi H, Bell S. Advancing landscape architecture research. In: van den Brink A, Bruns D, Tobi H, Bell S, editors. Research in Landscape Architecture: Methods and Methodology. New York: Routledge; 2017. pp. 11–23.
  • Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. 2nd ed. Los Angeles: Sage; 2011.
  • Danner D, Blasius J, Breyer B, Eifler S, Menold N, Paulhus DL, Ziegler M. Current challenges, new developments, and future directions in scale construction. Eur. J. Psychol. Assess. 2016;32(3):175–180. doi: 10.1027/1015-5759/a000375.
  • Deming ME, Swaffield S. Landscape Architecture Research. Hoboken: Wiley; 2011.
  • Departement Leefmilieu, Natuur en Energie. Uitvoeren van een uitgebreide schriftelijke enquête en een beperkte CAWI-enquête ter bepaling van het percentage gehinderden door geur, geluid en licht in Vlaanderen – SLO-3 [Conducting an extensive written survey and a limited CAWI survey to determine the percentage of people in Flanders experiencing nuisance from odour, noise and light – SLO-3]. Leuven: Market Analysis & Synthesis; 2013. www.lne.be/sites/default/files/atoms/files/lne-slo-3-eindrapport.pdf. Accessed 8 March 2017.
  • De Vaus D. Research Design in Social Research. London: Sage; 2001.
  • DeVellis RF. Scale Development: Theory and Applications. 3rd ed. Los Angeles: Sage; 2012.
  • Dillman DA. Mail and Internet Surveys. 2nd ed. Hoboken: Wiley; 2007.
  • Falk-Krzesinski HJ, Borner K, Contractor N, Fiore SM, Hall KL, Keyton J, Uzzi B, et al. Advancing the science of team science. Clin. Transl. Sci. 2010;3(5):263–266. doi: 10.1111/j.1752-8062.2010.00223.x.
  • Fanelli D. How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLoS ONE. 2009.
  • Fanelli D. Positive results increase down the hierarchy of the sciences. PLoS ONE. 2010.
  • Fanelli D, Ioannidis JPA. US studies may overestimate effect sizes in softer research. Proc. Natl. Acad. Sci. USA. 2013;110(37):15031–15036. doi: 10.1073/pnas.1302997110.
  • Fetters MD, Molina-Azorin JF. The Journal of Mixed Methods Research starts a new decade: principles for bringing in the new and divesting of the old language of the field. J. Mixed Methods Res. 2017;11(1):3–10. doi: 10.1177/1558689816682092.
  • Fischer ARH, Tobi H, Ronteltap A. When natural met social: a review of collaboration between the natural and social sciences. Interdiscip. Sci. Rev. 2011;36(4):341–358. doi: 10.1179/030801811X13160755918688.
  • Flick U. An Introduction to Qualitative Research. 3rd ed. London: Sage; 2006.
  • Fox MF. Fraud, ethics, and the disciplinary contexts of science and scholarship. Am. Sociol. 1990;21(1):67–71. doi: 10.1007/BF02691783.
  • Frischknecht PM. Environmental science education at the Swiss Federal Institute of Technology (ETH). Water Sci. Technol. 2000;41(2):31–36.
  • Godfray HCJ, Beddington JR, Crute IR, Haddad L, Lawrence D, Muir JF, Pretty J, Robinson S, Thomas SM, Toulmin C. Food security: the challenge of feeding 9 billion people. Science. 2010;327(5967):812–818. doi: 10.1126/science.1185383.
  • Greene JC. Is mixed methods social inquiry a distinctive methodology? J. Mixed Methods Res. 2008;2(1):7–22. doi: 10.1177/1558689807309969.
  • IPCC. Climate Change 2014 Synthesis Report. Geneva: Intergovernmental Panel on Climate Change; 2015. www.ipcc.ch/pdf/assessment-report/ar5/syr/SYR_AR5_FINAL_full_wcover.pdf. Accessed 8 March 2017.
  • John LK, Loewenstein G, Prelec D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol. Sci. 2012;23(5):524–532. doi: 10.1177/0956797611430953.
  • Kagan J. The Three Cultures: Natural Sciences, Social Sciences and the Humanities in the 21st Century. Cambridge: Cambridge University Press; 2009.
  • Kampen JK, Tamás P. Should I take this seriously? A simple checklist for calling bullshit on policy supporting research. Qual. Quant. 2014;48:1213–1223. doi: 10.1007/s11135-013-9830-8.
  • Kumar R. Research Methodology: A Step-by-Step Guide for Beginners. 1st ed. Los Angeles: Sage; 1999.
  • Kumar R. Research Methodology: A Step-by-Step Guide for Beginners. 4th ed. Los Angeles: Sage; 2014.
  • Kvale S. Doing Interviews. London: Sage; 2007.
  • Kvale S, Brinkmann S. Interviews: Learning the Craft of Qualitative Interviews. 2nd ed. London: Sage; 2009.
  • Lenzholder S, Duchhart I, van den Brink A. The relationship between research and design. In: van den Brink A, Bruns D, Tobi H, Bell S, editors. Research in Landscape Architecture: Methods and Methodology. New York: Routledge; 2017. pp. 54–64.
  • Molina-Azorin JF, Lopez-Gamero MD. Mixed methods studies in environmental management research: prevalence, purposes and designs. Bus. Strateg. Environ. 2016;25(2):134–148. doi: 10.1002/bse.1862.
  • Onwuegbuzie AJ, Leech NL. Taking the "Q" out of research: teaching research methodology courses without the divide between quantitative and qualitative paradigms. Qual. Quant. 2005;39(3):267–296. doi: 10.1007/s11135-004-1670-0.
  • Powell H, Mihalas S, Onwuegbuzie AJ, Suldo S, Daley CE. Mixed methods research in school psychology: a mixed methods investigation of trends in the literature. Psychol. Sch. 2008;45(4):291–309. doi: 10.1002/pits.20296.
  • Shipley B. Cause and Correlation in Biology. 2nd ed. Cambridge: Cambridge University Press; 2016.
  • Simmons JP, Nelson LD, Simonsohn U. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol. Sci. 2011;22:1359–1366. doi: 10.1177/0956797611417632.
  • Spelt EJH, Biemans HJA, Tobi H, Luning PA, Mulder M. Teaching and learning in interdisciplinary higher education: a systematic review. Educ. Psychol. Rev. 2009;21(4):365–378. doi: 10.1007/s10648-009-9113-z.
  • Spradley JP. The Ethnographic Interview. New York: Holt, Rinehart and Winston; 1979.
  • Steneck NH. Fostering integrity in research: definitions, current knowledge, and future directions. Sci. Eng. Ethics. 2006;12(1):53–74. doi: 10.1007/s11948-006-0006-y.
  • Tobi H. Measurement in interdisciplinary research: the contributions of widely-defined measurement and portfolio representations. Measurement. 2014;48:228–231. doi: 10.1016/j.measurement.2013.11.013.
  • Tobi H, Kampen JK. Survey error in an international context: an empirical assessment of cross-cultural differences regarding scale effects. Qual. Quant. 2013;47(1):553–559. doi: 10.1007/s11135-011-9476-3.
  • Tobi H, van den Brink A. A process approach to research in landscape architecture. In: van den Brink A, Bruns D, Tobi H, Bell S, editors. Research in Landscape Architecture: Methods and Methodology. New York: Routledge; 2017. pp. 24–34.
  • van Strien PJ. Praktijk als wetenschap: methodologie van het sociaal-wetenschappelijk handelen [Practice as science: methodology of social-scientific acting]. Assen: Van Gorcum; 1986.
  • Venkatesh V, Brown SA, Bala H. Bridging the qualitative–quantitative divide: guidelines for conducting mixed methods research in information systems. MIS Q. 2013;37(1):21–54. doi: 10.25300/MISQ/2013/37.1.02.
  • Vuye C, Bergiers A, Vanhooreweder B. The acoustical durability of thin noise reducing asphalt layers. Coatings. 2016.
  • Walmsley J, Johnson K. Inclusive Research with People with Learning Disabilities: Past, Present and Futures. London: Jessica Kingsley; 2003.
  • Walsh WB, Smith GL, London M. Developing an interface between engineering and social sciences: interdisciplinary team approach to solving societal problems. Am. Psychol. 1975;30(11):1067–1071. doi: 10.1037/0003-066X.30.11.1067.
