

The opportunity to complete the BSc (Hons) in Applied Accounting will close in May 2026.

RAP submissions

For more information on our final submission deadlines ahead of the programme closing, please visit our completion deadline information.

Access completion deadline page

Research and Analysis Project submissions and resubmissions should all be made via the online registration page. You will be required to complete your personal details online, enter your Project Mentor details, make payment, and upload your documents.

You must pay a project submission fee to Oxford Brookes with any project submission. The Oxford Brookes Research and Analysis Project (RAP) submission fee is £470 for both period 45 and period 46.

The link will close at 12.00 (midnight) GMT on the day stated on the project submission dates page. Should you experience any difficulties uploading your Research and Analysis Project, please contact the ACCA Office at Oxford Brookes via email at [email protected]. When you submit your Research and Analysis Project, it is assumed that you have read the current version of the Information Pack.

Please do not send hard copies as these will not be accepted.

You can upload your Research and Analysis Project in all major file formats; see below for details. You should upload separate files for your Research Report, Skills and Learning Statement, Appendices and List of References; these will then all be submitted along with your payment.

PLEASE NOTE

It is your responsibility to ensure that you are eligible to submit a Research and Analysis Project. You must have successfully completed the Applied Knowledge and Applied Skills exams, and completed the Ethics and Professional Skills module, before submitting a Research and Analysis Project to Oxford Brookes.

If you currently have Conditional Exemptions you are not eligible to submit a RAP.

If you have any queries regarding your Oxford Brookes status and eligibility, please contact [email protected].

Acceptable formats for uploading your RAP

RAP submission dates

Visit our RAP submission dates section for more information.

Visit our Resubmitting your RAP section for more information.

Update on BSc

The opportunity for students and members to complete the BSc (Hons) in Applied Accounting from Oxford Brookes University is available for a limited time only.

Related Links

  • Oxford Brookes University online submission website


Research Method

Research Project – Definition, Writing Guide and Ideas

Research Project

Definition:

A research project is a planned and systematic investigation into a specific area of interest or problem, with the goal of generating new knowledge, insights, or solutions. It typically involves identifying a research question or hypothesis, designing a study to test it, collecting and analyzing data, and drawing conclusions based on the findings.

Types of Research Project

Types of Research Projects are as follows:

Basic Research

This type of research focuses on advancing knowledge and understanding of a subject area or phenomenon, without any specific application or practical use in mind. The primary goal is to expand scientific or theoretical knowledge in a particular field.

Applied Research

Applied research is aimed at solving practical problems or addressing specific issues. This type of research seeks to develop solutions or improve existing products, services or processes.

Action Research

Action research is conducted by practitioners and aimed at solving specific problems or improving practices in a particular context. It involves collaboration between researchers and practitioners, and often involves iterative cycles of data collection and analysis, with the goal of improving practices.

Quantitative Research

This type of research uses numerical data to investigate relationships between variables or to test hypotheses. It typically involves large-scale data collection through surveys, experiments, or secondary data analysis.
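As a concrete illustration of quantitative hypothesis testing, the sketch below compares the means of two groups using Welch's t-statistic. This is a minimal example with invented data, not a complete analysis (a real study would also report degrees of freedom and a p-value):

```python
# Minimal sketch: comparing two independent groups with Welch's
# t-statistic. The data below are invented for illustration only.
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)  # sample variances (n-1 denominator)
    return (mean(a) - mean(b)) / (va / na + vb / nb) ** 0.5

control = [72, 75, 68, 70, 74, 69]      # e.g. test scores, old method
treatment = [78, 81, 76, 80, 79, 77]    # e.g. test scores, new method

t = welch_t(treatment, control)
print(f"t = {t:.2f}")  # a large |t| suggests the group means differ
```

A larger sample and a proper significance test would be needed before drawing any conclusion, but the calculation shows how numerical data feed directly into a test of a hypothesis.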

Qualitative Research

Qualitative research focuses on understanding and interpreting phenomena from the perspective of the people involved. It involves collecting and analyzing data in the form of text, images, or other non-numerical forms.

Mixed Methods Research

Mixed methods research combines elements of both quantitative and qualitative research, using multiple data sources and methods to gain a more comprehensive understanding of a phenomenon.

Longitudinal Research

This type of research involves studying a group of individuals or phenomena over an extended period of time, often years or decades. It is useful for understanding changes and developments over time.

Case Study Research

Case study research involves in-depth investigation of a particular case or phenomenon, often within a specific context. It is useful for understanding complex phenomena in their real-life settings.

Participatory Research

Participatory research involves active involvement of the people or communities being studied in the research process. It emphasizes collaboration, empowerment, and the co-production of knowledge.

Research Project Methodology

Research Project Methodology refers to the process of conducting research in an organized and systematic manner to answer a specific research question or to test a hypothesis. A well-designed research project methodology ensures that the research is rigorous, valid, and reliable, and that the findings are meaningful and can be used to inform decision-making.

There are several steps involved in research project methodology, which are described below:

Define the Research Question

The first step in any research project is to clearly define the research question or problem. This involves identifying the purpose of the research, the scope of the research, and the key variables that will be studied.

Develop a Research Plan

Once the research question has been defined, the next step is to develop a research plan. This plan outlines the methodology that will be used to collect and analyze data, including the research design, sampling strategy, data collection methods, and data analysis techniques.

Collect Data

The data collection phase involves gathering information through various methods, such as surveys, interviews, observations, experiments, or secondary data analysis. The data collected should be relevant to the research question and should be of sufficient quantity and quality to enable meaningful analysis.

Analyze Data

Once the data has been collected, it is analyzed using appropriate statistical techniques or other methods. The analysis should be guided by the research question and should aim to identify patterns, trends, relationships, or other insights that can inform the research findings.
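For a simple example of identifying patterns in collected data, the sketch below tallies categorical survey responses. The response values are invented for illustration:

```python
# Minimal sketch: summarising categorical survey responses to reveal
# a pattern. The responses below are invented for illustration only.
from collections import Counter

responses = ["agree", "agree", "neutral", "disagree", "agree",
             "neutral", "agree", "disagree", "agree", "neutral"]

counts = Counter(responses)
total = len(responses)
for answer, n in counts.most_common():
    print(f"{answer}: {n} ({n / total:.0%})")  # e.g. agree: 5 (50%)
```

Even this basic frequency summary is a form of analysis: it converts raw observations into a pattern (here, that half the respondents agreed) that can be interpreted against the research question.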

Interpret and Report Findings

The final step in the research project methodology is to interpret the findings and report them in a clear and concise manner. This involves summarizing the results, discussing their implications, and drawing conclusions that can be used to inform decision-making.

Research Project Writing Guide

Here are some guidelines to help you in writing a successful research project:

  • Choose a topic: Choose a topic that you are interested in and that is relevant to your field of study. It is important to choose a topic that is specific and focused enough to allow for in-depth research and analysis.
  • Conduct a literature review: Conduct a thorough review of the existing research on your topic. This will help you to identify gaps in the literature and to develop a research question or hypothesis.
  • Develop a research question or hypothesis: Based on your literature review, develop a clear research question or hypothesis that you will investigate in your study.
  • Design your study: Choose an appropriate research design and methodology to answer your research question or test your hypothesis. This may include choosing a sample, selecting measures or instruments, and determining data collection methods.
  • Collect data: Collect data using your chosen methods and instruments. Be sure to follow ethical guidelines and obtain informed consent from participants if necessary.
  • Analyze data: Analyze your data using appropriate statistical or qualitative methods. Be sure to clearly report your findings and provide interpretations based on your research question or hypothesis.
  • Discuss your findings: Discuss your findings in the context of the existing literature and your research question or hypothesis. Identify any limitations or implications of your study and suggest directions for future research.
  • Write your project: Write your research project in a clear and organized manner, following the appropriate format and style guidelines for your field of study. Be sure to include an introduction, literature review, methodology, results, discussion, and conclusion.
  • Revise and edit: Revise and edit your project for clarity, coherence, and accuracy. Be sure to proofread for spelling, grammar, and formatting errors.
  • Cite your sources: Cite your sources accurately and appropriately using the appropriate citation style for your field of study.

Examples of Research Projects

Some Examples of Research Projects are as follows:

  • Investigating the effects of a new medication on patients with a particular disease or condition.
  • Exploring the impact of exercise on mental health and well-being.
  • Studying the effectiveness of a new teaching method in improving student learning outcomes.
  • Examining the impact of social media on political participation and engagement.
  • Investigating the efficacy of a new therapy for a specific mental health disorder.
  • Exploring the use of renewable energy sources in reducing carbon emissions and mitigating climate change.
  • Studying the effects of a new agricultural technique on crop yields and environmental sustainability.
  • Investigating the effectiveness of a new technology in improving business productivity and efficiency.
  • Examining the impact of a new public policy on social inequality and access to resources.
  • Exploring the factors that influence consumer behavior in a specific market.

Characteristics of Research Project

Here are some of the characteristics that are often associated with research projects:

  • Clear objective: A research project is designed to answer a specific question or solve a particular problem. The objective of the research should be clearly defined from the outset.
  • Systematic approach: A research project is typically carried out using a structured and systematic approach that involves careful planning, data collection, analysis, and interpretation.
  • Rigorous methodology: A research project should employ a rigorous methodology that is appropriate for the research question being investigated. This may involve the use of statistical analysis, surveys, experiments, or other methods.
  • Data collection: A research project involves collecting data from a variety of sources, including primary sources (such as surveys or experiments) and secondary sources (such as published literature or databases).
  • Analysis and interpretation: Once the data has been collected, it needs to be analyzed and interpreted. This involves using statistical techniques or other methods to identify patterns or relationships in the data.
  • Conclusion and implications: A research project should lead to a clear conclusion that answers the research question. It should also identify the implications of the findings for future research or practice.
  • Communication: The results of the research project should be communicated clearly and effectively, using appropriate language and visual aids, to a range of audiences, including peers, stakeholders, and the wider public.

Importance of Research Project

Research projects are an essential part of the process of generating new knowledge and advancing our understanding of various fields of study. Here are some of the key reasons why research projects are important:

  • Advancing knowledge: Research projects are designed to generate new knowledge and insights into particular topics or questions. This knowledge can be used to inform policies, practices, and decision-making processes across a range of fields.
  • Solving problems: Research projects can help to identify solutions to real-world problems by providing a better understanding of the causes and effects of particular issues.
  • Developing new technologies: Research projects can lead to the development of new technologies or products that can improve people’s lives or address societal challenges.
  • Improving health outcomes: Research projects can contribute to improving health outcomes by identifying new treatments, diagnostic tools, or preventive strategies.
  • Enhancing education: Research projects can enhance education by providing new insights into teaching and learning methods, curriculum development, and student learning outcomes.
  • Informing public policy: Research projects can inform public policy by providing evidence-based recommendations and guidance on issues related to health, education, environment, social justice, and other areas.
  • Enhancing professional development: Research projects can enhance the professional development of researchers by providing opportunities to develop new skills, collaborate with colleagues, and share knowledge with others.

Research Project Ideas

Following are some Research Project Ideas:

Field: Psychology

  • Investigating the impact of social support on coping strategies among individuals with chronic illnesses.
  • Exploring the relationship between childhood trauma and adult attachment styles.
  • Examining the effects of exercise on cognitive function and brain health in older adults.
  • Investigating the impact of sleep deprivation on decision making and risk-taking behavior.
  • Exploring the relationship between personality traits and leadership styles in the workplace.
  • Examining the effectiveness of cognitive-behavioral therapy (CBT) for treating anxiety disorders.
  • Investigating the relationship between social comparison and body dissatisfaction in young women.
  • Exploring the impact of parenting styles on children’s emotional regulation and behavior.
  • Investigating the effectiveness of mindfulness-based interventions for treating depression.
  • Examining the relationship between childhood adversity and later-life health outcomes.

Field: Economics

  • Analyzing the impact of trade agreements on economic growth in developing countries.
  • Examining the effects of tax policy on income distribution and poverty reduction.
  • Investigating the relationship between foreign aid and economic development in low-income countries.
  • Exploring the impact of globalization on labor markets and job displacement.
  • Analyzing the impact of minimum wage laws on employment and income levels.
  • Investigating the effectiveness of monetary policy in managing inflation and unemployment.
  • Examining the relationship between economic freedom and entrepreneurship.
  • Analyzing the impact of income inequality on social mobility and economic opportunity.
  • Investigating the role of education in economic development.
  • Examining the effectiveness of different healthcare financing systems in promoting health equity.

Field: Sociology

  • Investigating the impact of social media on political polarization and civic engagement.
  • Examining the effects of neighborhood characteristics on health outcomes.
  • Analyzing the impact of immigration policies on social integration and cultural diversity.
  • Investigating the relationship between social support and mental health outcomes in older adults.
  • Exploring the impact of income inequality on social cohesion and trust.
  • Analyzing the effects of gender and race discrimination on career advancement and pay equity.
  • Investigating the relationship between social networks and health behaviors.
  • Examining the effectiveness of community-based interventions for reducing crime and violence.
  • Analyzing the impact of social class on cultural consumption and taste.
  • Investigating the relationship between religious affiliation and social attitudes.

Field: Computer Science

  • Developing an algorithm for detecting fake news on social media.
  • Investigating the effectiveness of different machine learning algorithms for image recognition.
  • Developing a natural language processing tool for sentiment analysis of customer reviews.
  • Analyzing the security implications of blockchain technology for online transactions.
  • Investigating the effectiveness of different recommendation algorithms for personalized advertising.
  • Developing an artificial intelligence chatbot for mental health counseling.
  • Investigating the effectiveness of different algorithms for optimizing online advertising campaigns.
  • Developing a machine learning model for predicting consumer behavior in online marketplaces.
  • Analyzing the privacy implications of different data sharing policies for online platforms.
  • Investigating the effectiveness of different algorithms for predicting stock market trends.
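To make one of the computer science ideas above more concrete, here is a toy sketch of lexicon-based sentiment scoring for customer reviews. The word lists are invented for illustration and are not an established sentiment lexicon; a real project would evaluate a trained model instead:

```python
# Toy sketch of lexicon-based sentiment scoring. The word lists are
# invented for illustration, not an established lexicon; a real
# research project would use a trained model and labelled data.
POSITIVE = {"great", "excellent", "love", "fast", "reliable"}
NEGATIVE = {"bad", "slow", "broken", "terrible", "disappointing"}

def sentiment(review: str) -> str:
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("great phone fast and reliable"))   # positive
print(sentiment("terrible battery and slow screen"))  # negative
```

A research project in this area would compare such a baseline against more sophisticated approaches on a labelled dataset, which is exactly the kind of "effectiveness of different algorithms" question listed above.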

Field: Education

  • Investigating the impact of teacher-student relationships on academic achievement.
  • Analyzing the effectiveness of different pedagogical approaches for promoting student engagement and motivation.
  • Examining the effects of school choice policies on academic achievement and social mobility.
  • Investigating the impact of technology on learning outcomes and academic achievement.
  • Analyzing the effects of school funding disparities on educational equity and achievement gaps.
  • Investigating the relationship between school climate and student mental health outcomes.
  • Examining the effectiveness of different teaching strategies for promoting critical thinking and problem-solving skills.
  • Investigating the impact of social-emotional learning programs on student behavior and academic achievement.
  • Analyzing the effects of standardized testing on student motivation and academic achievement.

Field: Environmental Science

  • Investigating the impact of climate change on species distribution and biodiversity.
  • Analyzing the effectiveness of different renewable energy technologies in reducing carbon emissions.
  • Examining the impact of air pollution on human health outcomes.
  • Investigating the relationship between urbanization and deforestation in developing countries.
  • Analyzing the effects of ocean acidification on marine ecosystems and biodiversity.
  • Investigating the impact of land use change on soil fertility and ecosystem services.
  • Analyzing the effectiveness of different conservation policies and programs for protecting endangered species and habitats.
  • Investigating the relationship between climate change and water resources in arid regions.
  • Examining the impact of plastic pollution on marine ecosystems and biodiversity.
  • Investigating the effects of different agricultural practices on soil health and nutrient cycling.

Field: Linguistics

  • Analyzing the impact of language diversity on social integration and cultural identity.
  • Investigating the relationship between language and cognition in bilingual individuals.
  • Examining the effects of language contact and language change on linguistic diversity.
  • Investigating the role of language in shaping cultural norms and values.
  • Analyzing the effectiveness of different language teaching methodologies for second language acquisition.
  • Investigating the relationship between language proficiency and academic achievement.
  • Examining the impact of language policy on language use and language attitudes.
  • Investigating the role of language in shaping gender and social identities.
  • Analyzing the effects of dialect contact on language variation and change.
  • Investigating the relationship between language and emotion expression.

Field: Political Science

  • Analyzing the impact of electoral systems on women’s political representation.
  • Investigating the relationship between political ideology and attitudes towards immigration.
  • Examining the effects of political polarization on democratic institutions and political stability.
  • Investigating the impact of social media on political participation and civic engagement.
  • Analyzing the effects of authoritarianism on human rights and civil liberties.
  • Investigating the relationship between public opinion and foreign policy decisions.
  • Examining the impact of international organizations on global governance and cooperation.
  • Investigating the effectiveness of different conflict resolution strategies in resolving ethnic and religious conflicts.
  • Analyzing the effects of corruption on economic development and political stability.
  • Investigating the role of international law in regulating global governance and human rights.

Field: Medicine

  • Investigating the impact of lifestyle factors on chronic disease risk and prevention.
  • Examining the effectiveness of different treatment approaches for mental health disorders.
  • Investigating the relationship between genetics and disease susceptibility.
  • Analyzing the effects of social determinants of health on health outcomes and health disparities.
  • Investigating the impact of different healthcare delivery models on patient outcomes and cost effectiveness.
  • Examining the effectiveness of different prevention and treatment strategies for infectious diseases.
  • Investigating the relationship between healthcare provider communication skills and patient satisfaction and outcomes.
  • Analyzing the effects of medical error and patient safety on healthcare quality and outcomes.
  • Investigating the impact of different pharmaceutical pricing policies on access to essential medicines.
  • Examining the effectiveness of different rehabilitation approaches for improving function and quality of life in individuals with disabilities.

Field: Anthropology

  • Analyzing the impact of colonialism on indigenous cultures and identities.
  • Investigating the relationship between cultural practices and health outcomes in different populations.
  • Examining the effects of globalization on cultural diversity and cultural exchange.
  • Investigating the role of language in cultural transmission and preservation.
  • Analyzing the effects of cultural contact on cultural change and adaptation.
  • Investigating the impact of different migration policies on immigrant integration and acculturation.
  • Examining the role of gender and sexuality in cultural norms and values.
  • Investigating the impact of cultural heritage preservation on tourism and economic development.
  • Analyzing the effects of cultural revitalization movements on indigenous communities.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes. Revised on 20 March 2023.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Introduction
  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive, allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive, with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics.

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval?

At each stage of the research design process, make sure that your choices are practically feasible.


Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation).
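The correlation-versus-causation point above can be illustrated numerically. The sketch below computes a Pearson correlation coefficient for two invented variables that are both driven by a third factor (summer temperature), so they correlate perfectly without either causing the other:

```python
# Minimal sketch: Pearson correlation between two invented variables.
# Both series track a hidden third factor (summer temperature), so a
# strong correlation here does NOT mean one causes the other.
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient for two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

ice_cream_sales = [20, 35, 50, 65, 80]   # invented monthly figures
drowning_incidents = [1, 2, 3, 4, 5]     # invented monthly figures

r = pearson_r(ice_cream_sales, drowning_incidents)
print(f"r = {r:.2f}")  # perfectly correlated, yet neither causes the other
```

A correlational design can report the value of r; only an experimental design that manipulates one variable under controlled conditions can support a causal claim.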

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Common types of qualitative design include case studies, ethnography, grounded theory, and phenomenology. They often have similar approaches in terms of data collection, but focus on different aspects when analysing the data.

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling. The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
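As an illustration, the difference between the two approaches can be sketched in a few lines of Python. The population and sample sizes here are hypothetical:

```python
import random

# Hypothetical sampling frame: 500 student IDs (illustrative only)
population = [f"student_{i}" for i in range(500)]

# Probability sampling: a simple random sample, where every member
# has an equal, known chance of selection
random.seed(42)  # for reproducibility
probability_sample = random.sample(population, k=100)

# Non-probability (convenience) sampling: e.g. the first 100 people
# who happen to be available -- prone to selection bias
convenience_sample = population[:100]

print(len(probability_sample), len(convenience_sample))
```

Only the random sample supports statistical generalisation to the full population; the convenience sample may systematically over-represent whoever was easiest to reach.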

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations, which events or actions will you count?

If you’re using surveys, which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity have already been established.

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis. With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics, you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
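For example, all three summaries can be computed with Python’s standard library. The test scores below are invented for illustration:

```python
import statistics
from collections import Counter

# Hypothetical test scores for a sample of 12 participants
scores = [55, 62, 68, 70, 70, 71, 73, 75, 78, 80, 85, 93]

# Distribution: the frequency of each score
frequencies = Counter(scores)

# Central tendency: mean and median
mean = statistics.mean(scores)
median = statistics.median(scores)

# Variability: sample standard deviation and range
sd = statistics.stdev(scores)
score_range = max(scores) - min(scores)

print(f"mean={mean:.1f}, median={median}, sd={sd:.1f}, range={score_range}")
```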

Using inferential statistics, you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
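As a sketch of what a comparison test computes under the hood, the following calculates Welch’s t statistic for two hypothetical groups using only Python’s standard library. In practice you would use a statistics package, which also gives you the p value:

```python
import statistics

# Hypothetical outcome scores for two independent groups
group_a = [12.1, 13.4, 11.8, 14.2, 12.9, 13.7]
group_b = [10.3, 11.1, 10.8, 11.9, 10.5, 11.4]

def welch_t(x, y):
    """Welch's t statistic for two independent samples with
    possibly unequal variances."""
    mx, my = statistics.mean(x), statistics.mean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    standard_error = (vx / len(x) + vy / len(y)) ** 0.5
    return (mx - my) / standard_error

t = welch_t(group_a, group_b)
print(f"t = {t:.2f}")  # compare against a t distribution for a p value
```

A larger absolute t value means the difference between group means is large relative to the variability within the groups.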

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis.

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.
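A minimal sketch of what such an operational definition might look like in code, assuming a made-up five-item self-rating scale (not a validated instrument; the item names and cut-off are illustrative assumptions):

```python
# Hypothetical operationalisation of "social anxiety": the mean of
# five self-rating items, each scored 1 (never) to 5 (always).
responses = {
    "avoids_crowds": 4,
    "fears_being_judged": 5,
    "avoids_speaking_up": 3,
    "physical_symptoms": 4,
    "cancels_social_plans": 2,
}

# The abstract concept becomes a single measurable indicator
score = sum(responses.values()) / len(responses)

# Illustrative cut-off only -- a real instrument would have
# empirically established norms
label = "high" if score >= 3.5 else "low"
print(f"social anxiety score: {score:.1f} ({label})")
```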

Before collecting data, it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis, use quantitative methods. If you want to explore ideas, thoughts, and meanings, use qualitative methods.
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables, use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Cite this Scribbr article


McCombes, S. (2023, March 20). Research Design | Step-by-Step Guide with Examples. Scribbr. Retrieved 3 June 2024, from https://www.scribbr.co.uk/research-methods/research-design/


Shona McCombes


Leeds Beckett University

Skills for Learning: Research Skills

Data analysis is an ongoing process that should occur throughout your research project. Suitable data-analysis methods must be selected when you write your research proposal. The nature of your data (i.e. quantitative or qualitative) will be influenced by your research design and purpose. The data will also influence the analysis methods selected.

We run interactive workshops to help you develop skills related to doing research, such as data analysis, writing literature reviews and preparing for dissertations. Find out more on the Skills for Learning Workshops page.

We have online academic skills modules within MyBeckett for all levels of university study. These modules will help your academic development and support your success at LBU. You can work through the modules at your own pace, revisiting them as required. Find out more from our FAQ What academic skills modules are available?

Quantitative data analysis

Broadly speaking, 'statistics' refers to methods, tools and techniques used to collect, organise and interpret data. The goal of statistics is to gain understanding from data. Therefore, you need to know how to:

  • Produce data – for example, by handing out a questionnaire or doing an experiment.
  • Organise, summarise, present and analyse data.
  • Draw valid conclusions from findings.

There are a number of statistical methods you can use to analyse data. An appropriate statistical method should, however, follow naturally from your research design. Therefore, you should think about data analysis at the early stages of your study design. You may need to consult a statistician for help with this.

Tips for working with statistical data

  • Plan so that the data you get has a good chance of successfully tackling the research problem. This will involve reading literature on your subject, as well as on what makes a good study.
  • To reach useful conclusions, you need to reduce uncertainties or 'noise'. Thus, you will need a sufficiently large data sample. A large sample will improve precision. However, this must be balanced against the 'costs' (time and money) of collection.
  • Consider the logistics. Will there be problems in obtaining sufficient high-quality data? Think about accuracy, trustworthiness and completeness.
  • Statistics are based on random samples. Consider whether your sample will be suited to this sort of analysis. Might there be biases to think about?
  • How will you deal with missing values (any data that is not recorded for some reason)? These can result from gaps in a record or whole records being missed out.
  • When analysing data, start by looking at each variable separately. Conduct initial/exploratory data analysis using graphical displays. Do this before looking at variables in conjunction or anything more complicated. This process can help locate errors in the data and also gives you a 'feel' for the data.
  • Look out for patterns of 'missingness'. They are likely to alert you if there’s a problem. If the 'missingness' is not random, then it will have an impact on the results.
  • Be vigilant and think through what you are doing at all times. Think critically. Statistics are not just mathematical tricks that a computer sorts out. Rather, analysing statistical data is a process that the human mind must interpret!
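The per-variable checks described above can be sketched in a few lines of Python (the data and the missing-value convention are hypothetical):

```python
import statistics

# Hypothetical raw data with missing values recorded as None
heights_cm = [172, 165, None, 180, 158, None, 175, 169]

# Look at each variable separately before anything more complicated:
# count the missing values, then summarise the observed values
observed = [v for v in heights_cm if v is not None]
n_missing = len(heights_cm) - len(observed)

print(f"n = {len(observed)}, missing = {n_missing}")
print(f"mean = {statistics.mean(observed):.1f}, "
      f"min = {min(observed)}, max = {max(observed)}")
```

Simple summaries like these help you spot data-entry errors (an impossible minimum or maximum) and patterns of missingness before moving to multivariate analysis.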

Top tips! Try inventing or generating the sort of data you might get and see if you can analyse it. Make sure that your process works before gathering actual data. Think what the output of an analytic procedure will look like before doing it for real.

(Note: it is actually difficult to generate realistic data. There are fraud-detection methods in place to identify data that has been fabricated. So, remember to get rid of your practice data before analysing the real stuff!)

Statistical software packages

Software packages can be used to analyse and present data. The most widely used ones are SPSS and NVivo.

SPSS is a statistical-analysis and data-management package for quantitative data analysis. Click on ‘How do I install SPSS?’ to learn how to download SPSS to your personal device. SPSS can perform a wide variety of statistical procedures. Some examples are:

  • Data management (e.g. creating subsets of data or transforming data).
  • Summarising, describing or presenting data (e.g. mean, median and frequency).
  • Looking at the distribution of data (e.g. standard deviation).
  • Comparing groups for significant differences using parametric (e.g. t-test) and non-parametric (e.g. chi-square) tests.
  • Identifying significant relationships between variables (e.g. correlation).

NVivo can be used for qualitative data analysis. It is suitable for use with a wide range of methodologies. Click on ‘How do I access NVivo’ to learn how to download NVivo to your personal device. NVivo supports grounded theory, survey data, case studies, focus groups, phenomenology, field research and action research. With NVivo, you can:

  • Process data such as interview transcripts, literature or media extracts, and historical documents.
  • Code data on screen and explore all coding and documents interactively.
  • Rearrange, restructure, extend and edit text, coding and coding relationships.
  • Search imported text for words, phrases or patterns, and automatically code the results.

Qualitative data analysis

Miles and Huberman (1994) point out that there are diverse approaches to qualitative research and analysis. They suggest, however, that it is possible to identify 'a fairly classic set of analytic moves arranged in sequence'. This involves:

  • Affixing codes to a set of field notes drawn from observation or interviews.
  • Noting reflections or other remarks in the margins.
  • Sorting/sifting through these materials to identify: a) similar phrases, relationships between variables, patterns and themes and b) distinct differences between subgroups and common sequences.
  • Isolating these patterns/processes and commonalties/differences. Then, taking them out to the field in the next wave of data collection.
  • Highlighting generalisations and relating them to your original research themes.
  • Taking the generalisations and analysing them in relation to theoretical perspectives.

        (Miles and Huberman, 1994)

Patterns and generalisations are usually arrived at through a process of analytic induction (see the final two steps above). Qualitative analysis rarely involves statistical analysis of relationships between variables. Instead, it aims to gain an in-depth understanding of concepts, opinions or experiences.

Presenting information

There are a number of different ways of presenting and communicating information. The particular format you use is dependent upon the type of data generated from the methods you have employed.

Here are some appropriate ways of presenting information for different types of data:

Bar charts: These may be useful for comparing relative sizes. However, they tend to use a large amount of ink to display a relatively small amount of information. Consider a simple line chart as an alternative.

Pie charts: These have the benefit of indicating that the data must add up to 100%. However, they make it difficult for viewers to distinguish relative sizes, especially if two slices have a difference of less than 10%.

Other examples of presenting data in graphical form include line charts and scatter plots.

Qualitative data is more likely to be presented in text form, for example using quotations from interviews or field diaries.

  • Plan ahead, thinking carefully about how you will analyse and present your data.
  • Think through possible restrictions to resources you may encounter and plan accordingly.
  • Find out about the different IT packages available for analysing your data and select the most appropriate.
  • If necessary, allow time to attend an introductory course on a particular computer package. You can book SPSS and NVivo workshops via MyHub .
  • Code your data appropriately, assigning conceptual or numerical codes as suitable.
  • Organise your data so it can be analysed and presented easily.
  • Choose the most suitable way of presenting your information, according to the type of data collected. This will allow your information to be understood and interpreted better.

Primary, secondary and tertiary sources

Information sources are sometimes categorised as primary, secondary or tertiary sources depending on whether or not they are ‘original’ materials or data. For some research projects, you may need to use primary sources as well as secondary or tertiary sources. However, the distinction between primary and secondary sources is not always clear and depends on the context. For example, a newspaper article might usually be categorised as a secondary source. But it could also be regarded as a primary source if it were an article giving a first-hand account of a historical event written close to the time it occurred.

  • Primary sources
  • Secondary sources
  • Tertiary sources
  • Grey literature

Primary sources are original sources of information that provide first-hand accounts of what is being experienced or researched. They enable you to get as close to the actual event or research as possible. They are useful for getting the most contemporary information about a topic.

Examples include diary entries, newspaper articles, census data, journal articles with original reports of research, letters, email or other correspondence, original manuscripts and archives, interviews, research data and reports, statistics, autobiographies, exhibitions, films, and artists' writings.

Some information will be available on an Open Access basis, freely accessible online. However, many academic sources are paywalled, and you may need to log in as a Leeds Beckett student to access them. Where Leeds Beckett does not have access to a source, you can use our Request It! Service.

Secondary sources interpret, evaluate or analyse primary sources. They're useful for providing background information on a topic, or for looking back at an event from a current perspective. The majority of your literature searching will probably be done to find secondary sources on your topic.

Examples include journal articles which review or interpret original findings, popular magazine articles commenting on more serious research, textbooks and biographies.

The term tertiary sources isn't used a great deal. There's overlap between what might be considered a secondary source and a tertiary source. One definition is that a tertiary source brings together secondary sources.

Examples include almanacs, fact books, bibliographies, dictionaries and encyclopaedias, directories, indexes and abstracts. They can be useful for introductory information or an overview of a topic in the early stages of research.

Depending on your subject of study, grey literature may be another source you need to use. Grey literature includes technical or research reports, theses and dissertations, conference papers, government documents, white papers, and so on.

Artificial intelligence tools

Before using any generative artificial intelligence or paraphrasing tools in your assessments, you should check if this is permitted on your course.

If their use is permitted on your course, you must acknowledge any use of generative artificial intelligence tools such as ChatGPT or paraphrasing tools (e.g., Grammarly, Quillbot, etc.), even if you have only used them to generate ideas for your assessments or for proofreading.

  • Academic Integrity Module in MyBeckett
  • Assignment Calculator
  • Building on Feedback
  • Disability Advice
  • Essay X-ray tool
  • International Students' Academic Introduction
  • Manchester Academic Phrasebank
  • Quote, Unquote
  • Skills and Subject Support
  • Turnitin Grammar Checker


  • Research Methods Checklist
  • Sampling Checklist

Skills for Learning FAQs



What is Project Analysis and Why it is Important?

By Aastha Shaw Jan 25, 2022


What do you do to check if a project is going on track? 

You analyze it!

Consistent project analysis helps you make the right choices at the right time, leading you towards a more successful outcome and the highest possible ROI. 

Here we will talk about project analysis, its importance, the different types of project analysis, and lastly, how you can implement it using the right tools.

What is Project Analysis?

When executing a project, you need to analyze it periodically. Failing to do so means unexpected challenges, overlooked critical information, and flaws in the work process that only surface as the project unfolds.

This is why you need project analysis.

Project analysis means assessing every expense, risk, and problem related to a project before working on it, and evaluating the outcome once the work is done.


Importance of Project Analysis

A study conducted last year found that over two-thirds of all projects were not completed on time and went over budget.

What separates the failed two-thirds of the projects from the successful one-third? 

Regular analysis.

Project analysis lets you see the present problems and prepare for and avoid future problems. This ensures smooth project execution and timely project delivery.

For efficient project analysis, it is equally important to have the right tool that will help you monitor and analyze your project from its initiation to completion.

Types of project analysis:

A. Ongoing Project Risk Analysis

What happens if a key teammate gets injured during a project and needs to take a month off? Or your equipment malfunctions, stopping the work while labor charges keep adding up? Or a natural calamity takes place?

All of these can affect the project timeline and cost.

Risk analysis ensures that the least number of surprises occur during your project. It helps you predict the uncertainties in the projects and minimize the occurrence and impact of these uncertainties.

How to do project risk analysis?

1. Define Critical Path:

Each project consists of dependent tasks: tasks that rely on one or more other tasks being completed in a particular order.

This is where understanding the longest chain of dependencies or the project's critical path becomes very important. Any delay in the critical path would ultimately lead to missed deadlines. 

You can use project management software to map your project plan and highlight its critical path (many tools mark critical-path tasks in red).

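As an illustration, the critical path length of a small task graph can be computed as the longest chain of durations through the dependencies. The tasks and durations below are invented for the sketch:

```python
from functools import lru_cache

# Hypothetical task graph:
# task -> (duration in days, list of prerequisite tasks)
tasks = {
    "design":  (3, []),
    "procure": (5, ["design"]),
    "build":   (4, ["procure"]),
    "docs":    (2, ["design"]),
    "test":    (3, ["build", "docs"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Earliest finish time of `task`: its duration plus the longest
    chain of prerequisite durations leading into it."""
    duration, deps = tasks[task]
    return duration + max((earliest_finish(d) for d in deps), default=0)

# The critical path length is the latest finish among all tasks;
# any delay on that chain delays the whole project
project_length = max(earliest_finish(t) for t in tasks)
print(f"minimum project length: {project_length} days")
```

Here the chain design → procure → build → test is the critical path; delaying "docs" by a day would not move the deadline, but delaying "procure" would.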

2. Streamline communication channels:

You don't want to be spending most of your time coordinating with the management, sales team, clients, and vendors. So, it's essential to keep communication and information flowing in one tool. 

Establishing a good communication channel instead of checking on everyone for updates will help you better track task progress and save time. With a proper communication channel in place, task assignees can keep you updated in real-time and let you know about any delays, problems, or requirements easily.

Thus, it is critical to have a project management tool that allows real-time collaboration & communication via Chat, Task Comments, and Video conferencing.

3. Regularly monitor risks:

Once you have defined the project's critical path and streamlined the communication channel, you need to focus on how each task is panning out. 

You would not want to be unaware of situations such as delays in the shipment of raw material on which a lot of your project tasks depend. 

This is why we recommend tracking and analyzing the project's progress every week. (In SmartTask, there's a News Feed view that would allow you to share the project status with the team and for team members to share their thoughts on the status update.)


Say today is Saturday, and you are going through last week's progress. You identify that one procurement item might become a bottleneck and affect the critical path.

This will help you immediately take the necessary actions and save your project from major setbacks.


4. Determine their impact on the project:

Every risk has an impact, some more than others. 

You can evaluate the impact of the risks by looking at:

  • How much will the task be delayed?
  • Would it affect the critical path? 

If the delay does affect the critical path, can we have the procurement team expedite the delivery? Or is it a lost cause?

5. Prepare a contingency plan to treat those risks:

A contingency plan helps you to be prepared for future challenges and reduce the biggest risks to manageable levels. It is a course of action that enables you to deal with a situation that might happen. 

In the above example, we tried to expedite the raw material delivery; however, it is clear now that the item won't be delivered on time.

Now, as a project manager, it's time to re-evaluate the project plan and see if you can save the project from overshooting the timeline. 

There are two ways to deal with this:

  • Allocating additional resources to tasks dependent on the raw material shipment and attempting to complete tasks ahead of schedule. This speeds up your work and allows you to get back on track.
  • Executing dependent processes in parallel. Some tasks can be run in parallel, though this is often avoided to stay safe and complete tasks carefully. With good planning and the required resources, however, you can execute dependent tasks in parallel to complete the project on time and at the same quality.

Your decision would be based on the trade-offs you are ready to make. So, it's wise to seek your team's and management's feedback to get better insights and make the best decisions.

6. Regularly update the team on the project’s progress:

It’s important to keep everyone updated on the project progress and all the important decisions taken. With SmartTask’s News Feed feature you can update your teammates and project stakeholders about everything related to your project in real-time. 


On the feed update itself, they can share their feedback, mention others, and share important files.

With SmartTask, you can record all your project history in one place and easily access them when needed. It also helps you track your projects to identify any potential problems and bottlenecks so that you can deal with them on time.

B. Project Cost Analysis

Suppose you land a software project. You roughly forecast the project timeline, resources, budget, etc., and reach a tentative project cost estimate at the project proposal time.

But moving forward without a thorough project cost analysis leads to budget overruns, missed deadlines, and a miserably failed project.

Therefore, it's critical to conduct a proper project cost analysis and develop a strategic plan to avoid repeated cost overruns and save your project from sinking.

How to do project cost analysis?

1. Determine the project goal:

Before you start working on project costing, your team needs to have a clear idea of the final goal and requirements of the project. 

This is where a clearly defined client requirement document helps. 

The client's requirement document will help your team divide the project into milestones and adequately define all the resources needed for timely project delivery.

2. Draw up a project plan:

Once the milestones are set, utilize project management software to map out tasks within those milestones with their delivery timelines and expectations. 

You can use task dependencies to better map out the project plan and provide that clear understanding of the critical path to your team. 

Lastly, assign each task to a suitable person. A good collaboration software would notify each team member when a new task is assigned. 

3. Set Time Estimates: 

One of the most crucial cost factors of any project is the resources needed to accomplish it.

Make sure that the time to be spent on a task is recorded as a time estimate. (Note: the time estimate, i.e. the effort needed to accomplish the task, is independent of the task's timeline.)

4. Define Resource Cost:

A good resource management software allows you to define the cost per hour and Billing rate for each resource. 

So, along with the fixed cost, assignee, time estimates, and efforts needed to accomplish each task in the project, you also get a clear idea of the cost per resource. And in turn, you can set and track the total cost of the project better.  

5. Set a Factor of Safety:

While you understand the project's total cost, it's essential to consider contingency funds if things don't go as planned.

Government compliance requirements, taxes, and client or vendor delays can hold up the project and increase the cost of delivery.

It's prudent to consider all these unknown factors and add them to the project cost as a Factor of Safety. 

While each project is unique and Factor of Safety (FOS) percentages can differ, we recommend an FOS of no less than 20%.
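Putting steps 3 to 5 together, a minimal cost roll-up with a 20% Factor of Safety might look like this. All figures are hypothetical:

```python
# Hypothetical cost roll-up: per-task effort (hours) x resource rate,
# plus any fixed costs, then a 20% Factor of Safety on top
tasks = [
    # (estimated_hours, cost_per_hour, fixed_cost)
    (40, 50, 200),   # backend work
    (24, 45, 0),     # UI work
    (16, 60, 150),   # testing
]

base_cost = sum(hours * rate + fixed for hours, rate, fixed in tasks)

FOS = 0.20  # recommended minimum safety margin
total_budget = base_cost * (1 + FOS)

print(f"base cost: {base_cost}, budget with FOS: {total_budget:.0f}")
```

The FOS line is the contingency fund: the gap between `base_cost` and `total_budget` absorbs delays and unknowns without the project going over budget.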



C. Workload Analysis

Suppose your team has two software engineers who are working on a project. You divide the tasks equally amongst them. 

After they start working, you see that while Engineer A has completed his tasks, Engineer B has not started his work.

But on digging deeper, you find that Engineer A didn't have much on his plate, whereas Engineer B was overloaded with tasks from more than three projects, causing the delay.

To avoid such situations, you need a workload analysis tool that forecasts each team member's workload and shows whether they are overloaded or under-utilized. 

How to do workload analysis?

1. Define effort required in each project:

As noted in the "Project cost analysis" section, define the time effort required to accomplish each task in the project.

Depending on the task's timeline and the effort required, estimate the number of hours blocked for each project member.
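A minimal sketch of that estimate (member names, efforts, and timelines below are hypothetical): spreading each task's effort over its timeline gives the hours a member is blocked per working day.

```python
# Hypothetical sketch: spread each task's effort over its timeline
# to estimate hours blocked per working day, per member.
tasks = [
    {"member": "A", "effort_hours": 20, "working_days": 5},
    {"member": "B", "effort_hours": 45, "working_days": 5},
    {"member": "B", "effort_hours": 15, "working_days": 5},  # task from a second project
]

load = {}
for t in tasks:
    per_day = t["effort_hours"] / t["working_days"]
    load[t["member"]] = load.get(t["member"], 0) + per_day

print(load)  # {'A': 4.0, 'B': 12.0} -> B is well past an 8-hour day
```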

2. Group projects to understand the holistic picture:

Since a team member may be involved in multiple projects, it's important to group all these projects. This will give you a better understanding of their total effort and responsibilities across different projects. 

In the workload view, red indicates overloading.

A workload view helps you see what each team member is working on and also enables you to reassign tasks from one member to another if required.

3. Balance workload among the team:

If your team is continuously overburdened and stressed by extra work, it may soon lead to burnout. Thus, it's important to distribute the workload evenly among your team members.

Once you identify resource overloading, here's how you can handle it:

  • Unloading the task - by re-allocating it from one person to another.
  • Extending the task's timeline - to lower the effort required per day.
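The second option can be sketched numerically: extend the timeline until the daily load fits within a member's capacity (the numbers below are illustrative, not from the article).

```python
import math

# Hypothetical sketch: if a task's daily load exceeds capacity,
# extend its timeline so effort per day drops to the capacity.
effort_hours = 40
working_days = 4
capacity_per_day = 8  # hours the member can give this task per day

per_day = effort_hours / working_days  # 10 h/day -> overloaded
if per_day > capacity_per_day:
    working_days = math.ceil(effort_hours / capacity_per_day)

print(working_days)  # 5 -> now 8 h/day, within capacity
```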

Want to save your team from burnout?

D. Process Analysis

Sometimes a particular process can be complicated and unnecessarily lengthy, leading to poor project performance.

Process analysis helps you analyze the process, identify inefficiencies that can affect your bottom line, and improve it.

How to do process analysis?

1. Define what you want to analyze:

If you run the same type of project again and again, it's natural to settle on a template and follow the same process for similar future projects.

This often leaves little opportunity to reflect on and improve the process, as you keep following the same template again and again.

You need to break this cycle and identify where your processes need revamping.

2. Collect all information about the process:

Collect as much information as possible from all past projects on the selected process. 

Identify the stakeholders and all the people involved in the process.

Gather data on how they tackle the process, what they do, when they do it, how often, what tools they use, what procedures they currently follow, and more. 

3. Analyze the process:

Now, looking at the collected information and the existing process template, perform a thorough analysis to answer questions like:

  • What are the most critical aspects of the project?
  • What are the most time-consuming aspects?
  • Is there anything causing delays?
  • Which tasks get delayed regularly?
  • Is it possible to speed up the process?
  • Are there any steps that can be automated or eliminated?
  • What are the most prevalent complaints from people involved in the process?
  • In which areas are human errors most likely to occur?

This is where having project management software with a portfolio view can make your task way easier.

With a portfolio, you can group all the similar projects together and get all the insights like due dates, delays, assignees, progress, and more at once.  

Don’t forget to get feedback from important stakeholders and people involved to ensure that you are not missing out on anything. 

4. Make changes for improvement:

It’s time to envision how the process can be improved in the future and eventually change the project template to reflect those improvements. 

These improvements can include adding or removing tasks, automating repetitive tasks, changing timelines or the effort required for each task, changing the team, and so on.

Suppose your goal is to shorten the process cycle. Then you need to develop solutions such as automating the process where possible and reducing manual labor to save time.

Also, make sure to convey any changes to the stakeholders involved and monitor the process regularly.


Keep in mind that business process analysis is a continuous exercise. You must examine your processes regularly and improve them to keep them error-free.

Here’s a detailed video on how to do process analysis:

To conclude…

Project analysis is critical for companies and project managers to make their projects more successful and sustainable. 

While it’s evident that problems and challenges will come your way, you can keep things under control with the right tool and approach.

There are many project management tools in the market, but SmartTask checks all the boxes for analyzing and managing your projects efficiently. 

So be smart and get SmartTask to make assessing your projects easier and deliver better. 

Want help with Project Analysis? Book a Free Consultation


Data Analysis in Research: Types & Methods


Content Index

  • Why analyze data in research?
  • Types of data in research
  • Finding patterns in the qualitative data
  • Methods used for data analysis in qualitative research
  • Preparing data for analysis
  • Methods used for data analysis in quantitative research
  • Considerations in research data analysis
  • What is data analysis in research?

Definition of research in data analysis: According to LeCompte and Schensul, research data analysis is a process used by researchers to reduce data to a story and interpret it to derive insights. The data analysis process helps reduce a large chunk of data into smaller fragments that make sense. 

Three essential things occur during the data analysis process. The first is data organization. The second is summarization and categorization, which together reduce the data and help find patterns and themes for easy identification and linking. The third and last is data analysis itself, which researchers do in both top-down and bottom-up fashion.

LEARN ABOUT: Research Process Steps

On the other hand, Marshall and Rossman describe data analysis as a messy, ambiguous, and time-consuming but creative and fascinating process through which a mass of collected data is brought to order, structure and meaning.

We can say that “data analysis and interpretation is a process representing the application of deductive and inductive logic to the research.”

Researchers rely heavily on data as they have a story to tell or research problems to solve. It starts with a question, and data is nothing but an answer to that question. But, what if there is no question to ask? Well! It is possible to explore data even without a problem – we call it ‘Data Mining’, which often reveals some interesting patterns within the data that are worth exploring.

Irrespective of the type of data researchers explore, their mission and audiences’ vision guide them to find the patterns to shape the story they want to tell. One of the essential things expected from researchers while analyzing data is to stay open and remain unbiased toward unexpected patterns, expressions, and results. Remember, sometimes data analysis tells the most unforeseen yet exciting stories that were not expected when initiating data analysis. Therefore, rely on the data you have at hand and enjoy the journey of exploratory research. 

Create a Free Account

Every kind of data has the rare quality of describing things once a specific value is assigned to it. For analysis, you need to organize these values and process and present them in a given context to make them useful. Data can come in different forms; here are the primary data types.

  • Qualitative data: When the data presented has words and descriptions, then we call it qualitative data . Although you can observe this data, it is subjective and harder to analyze in research, especially for comparison. Example: anything describing taste, experience, texture, or an opinion is qualitative data. This type of data is usually collected through focus groups, personal qualitative interviews , qualitative observation, or open-ended questions in surveys.
  • Quantitative data: Any data expressed in numbers or numerical figures is called quantitative data . This type of data can be distinguished into categories, grouped, measured, calculated, or ranked. Example: age, rank, cost, length, weight, scores, etc. all come under this type of data. You can present such data in graphical formats and charts, or apply statistical analysis methods to it. The OMS (Outcomes Measurement Systems) questionnaires in surveys are a significant source of numeric data.
  • Categorical data: Data presented in groups. However, an item included in categorical data cannot belong to more than one group. Example: a person responding to a survey with their living style, marital status, smoking habit, or drinking habit provides categorical data. The chi-square test is a standard method used to analyze this data.
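As a sketch of the chi-square test mentioned for categorical data, the statistic for a small contingency table can be computed by hand (the observed counts below are made up for illustration):

```python
# Minimal chi-square statistic for a 2x2 contingency table (illustrative data).
observed = [[20, 30],   # e.g. smokers: group A vs group B
            [30, 20]]   # non-smokers

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

print(chi2)  # 4.0 here; compare against the critical value for 1 degree of freedom
```

The computed statistic is then compared with the chi-square distribution for the table's degrees of freedom to decide whether the groups differ significantly.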

Learn More : Examples of Qualitative Data in Education

Data analysis in qualitative research

Data analysis in qualitative research works a little differently from numerical data, as qualitative data is made up of words, descriptions, images, objects, and sometimes symbols. Extracting insight from such complicated information is a complex process; hence it is typically used for exploratory research and data analysis .

Although there are several ways to find patterns in textual information, a word-based method is the most relied-upon and widely used technique for research and data analysis. Notably, the data analysis process in qualitative research is manual: researchers usually read the available data and find repetitive or commonly used words. 

For example, while studying data collected from African countries to understand the most pressing issues people face, researchers might find “food” and “hunger” are the most commonly used words and will highlight them for further analysis.
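A minimal sketch of this word-counting step, using made-up responses and a made-up stopword list:

```python
from collections import Counter

# Illustrative sketch: count word frequencies in open-ended responses
# (the responses and stopwords below are made up).
responses = [
    "lack of food and clean water",
    "hunger and food insecurity",
    "food prices and hunger",
]
stopwords = {"and", "of", "the"}

words = [w for w in " ".join(responses).split() if w not in stopwords]
counts = Counter(words)
print(counts.most_common(2))  # [('food', 3), ('hunger', 2)]
```

In practice a real stopword list and some normalization (lowercasing, stemming) would come first, but the highlighted words fall out of the same counting step.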

LEARN ABOUT: Level of Analysis

The keyword context is another widely used word-based technique. In this method, the researcher tries to understand the concept by analyzing the context in which the participants use a particular keyword.  

For example , researchers conducting research and data analysis for studying the concept of ‘diabetes’ amongst respondents might analyze the context of when and how the respondent has used or referred to the word ‘diabetes.’

The scrutiny-based technique is also one of the highly recommended text analysis methods used to identify patterns in qualitative data. Compare and contrast is the most widely used method under this technique, differentiating how specific pieces of text are similar to or different from each other. 

For example: to find out the “importance of a resident doctor in a company,” the collected data is divided into people who think it is necessary to hire a resident doctor and those who think it is unnecessary. Compare and contrast is the best method for analyzing polls with single-answer question types.

Metaphors can be used to reduce the data pile and find patterns in it so that it becomes easier to connect data with theory.

Variable Partitioning is another technique used to split variables so that researchers can find more coherent descriptions and explanations from the enormous data.

LEARN ABOUT: Qualitative Research Questions and Questionnaires

There are several techniques to analyze the data in qualitative research; here are some commonly used methods:

  • Content Analysis:  It is widely accepted and the most frequently employed technique for data analysis in research methodology. It can be used to analyze the documented information from text, images, and sometimes from the physical items. It depends on the research questions to predict when and where to use this method.
  • Narrative Analysis: This method is used to analyze content gathered from various sources such as personal interviews, field observation, and  surveys . Most of the time, the stories or opinions shared by people are analyzed with a focus on finding answers to the research questions.
  • Discourse Analysis:  Similar to narrative analysis, discourse analysis is used to analyze the interactions with people. Nevertheless, this particular method considers the social context under which or within which the communication between the researcher and respondent takes place. In addition to that, discourse analysis also focuses on the lifestyle and day-to-day environment while deriving any conclusion.
  • Grounded Theory:  When you want to explain why a particular phenomenon happened, then using grounded theory for analyzing quality data is the best resort. Grounded theory is applied to study data about the host of similar cases occurring in different settings. When researchers are using this method, they might alter explanations or produce new ones until they arrive at some conclusion.

LEARN ABOUT: 12 Best Tools for Researchers

Data analysis in quantitative research

The first stage in research and data analysis is to prepare the data for analysis so that nominal data can be converted into something meaningful. Data preparation consists of the phases below.

Phase I: Data Validation

Data validation is done to understand whether the collected data sample meets the pre-set standards or is a biased sample. It is divided into four stages:

  • Fraud: To ensure an actual human being records each response to the survey or the questionnaire
  • Screening: To make sure each participant or respondent is selected or chosen in compliance with the research criteria
  • Procedure: To ensure ethical standards were maintained while collecting the data sample
  • Completeness: To ensure that the respondent answered all the questions in an online survey, or that the interviewer asked all the questions devised in the questionnaire.

Phase II: Data Editing

More often than not, an extensive research data sample comes loaded with errors. Respondents sometimes fill in fields incorrectly or skip them accidentally. Data editing is a process wherein researchers confirm that the provided data is free of such errors. They need to conduct necessary consistency and outlier checks to edit the raw data and make it ready for analysis.

Phase III: Data Coding

Out of all three, this is the most critical phase of data preparation, associated with grouping and assigning values to survey responses . If a survey is completed with a sample size of 1,000, the researcher might create age brackets to distinguish respondents based on their age. It then becomes easier to analyze small data buckets rather than deal with the massive data pile.

LEARN ABOUT: Steps in Qualitative Research

After the data is prepared for analysis, researchers are open to using different research and data analysis methods to derive meaningful insights. Statistical analysis plans are the most favored for analyzing numerical data. In statistical analysis, distinguishing between categorical and numerical data is essential, as categorical data involves distinct categories or labels, while numerical data consists of measurable quantities. The methods fall into two groups: first, descriptive statistics, used to describe the data; second, inferential statistics, which help in comparing the data.

Descriptive statistics

This method is used to describe the basic features of versatile types of data in research. It presents the data in such a meaningful way that patterns in the data start making sense. Nevertheless, descriptive analysis does not go beyond summarizing the data; conclusions are still based on the hypotheses researchers have formulated so far. Here are a few major types of descriptive analysis methods.

Measures of Frequency

  • Count, Percent, Frequency
  • It is used to denote how often a particular event occurs.
  • Researchers use it when they want to showcase how often a response is given.
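For instance, counts and percentages of responses can be computed directly (the survey responses below are made up):

```python
from collections import Counter

# Illustrative sketch: count and percentage of each survey response.
responses = ["yes", "no", "yes", "yes", "maybe", "no", "yes", "yes"]

counts = Counter(responses)
percents = {k: 100 * v / len(responses) for k, v in counts.items()}
print(counts["yes"], percents["yes"])  # 5 62.5
```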

Measures of Central Tendency

  • Mean, Median, Mode
  • The method is widely used to describe the central point of a distribution.
  • Researchers use this method when they want to showcase the most commonly or averagely indicated response.
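All three measures are available in Python's standard statistics module (the scores below are made up):

```python
import statistics

scores = [4, 8, 6, 5, 3, 8, 9]  # made-up survey scores

print(statistics.mean(scores))    # ~6.14
print(statistics.median(scores))  # 6
print(statistics.mode(scores))    # 8
```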

Measures of Dispersion or Variation

  • Range, Variance, Standard deviation
  • Range = the difference between the highest and lowest scores.
  • Variance and standard deviation measure how far observed scores deviate from the mean.
  • It is used to identify the spread of scores by stating intervals.
  • Researchers use this method to show how spread out the data is, and how widely scores are dispersed around the mean.
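A quick sketch with made-up data (using the population versions here; statistics.variance and statistics.stdev give the sample versions):

```python
import statistics

scores = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up data

data_range = max(scores) - min(scores)
variance = statistics.pvariance(scores)  # population variance
stdev = statistics.pstdev(scores)        # population standard deviation

print(data_range, variance, stdev)  # 7 4.0 2.0
```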

Measures of Position

  • Percentile ranks, Quartile ranks
  • It relies on standardized scores, helping researchers identify the relationship between different scores.
  • It is often used when researchers want to compare scores with the average count.
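A sketch using statistics.quantiles (Python 3.8+) for quartile cut points, plus a simple percentile-rank helper (the data is made up):

```python
import statistics

scores = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

quartiles = statistics.quantiles(scores, n=4)  # cut points Q1, Q2, Q3
print(quartiles)  # [2.75, 5.5, 8.25]

# Percentile rank of a score: share of scores strictly below it
def percentile_rank(data, x):
    return 100 * sum(1 for v in data if v < x) / len(data)

print(percentile_rank(scores, 8))  # 70.0
```

Note that statistics.quantiles defaults to the "exclusive" method; other conventions give slightly different cut points.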

For quantitative research, descriptive analysis often gives absolute numbers, but on its own it is never sufficient to demonstrate the rationale behind those numbers. Nevertheless, it is necessary to think of the research and data analysis method best suiting your survey questionnaire and the story researchers want to tell. For example, the mean is the best way to demonstrate students’ average scores in schools. It is better to rely on descriptive statistics when the researchers intend to keep the research or outcome limited to the provided  sample  without generalizing it. For example, when you want to compare the average votes cast in two different cities, descriptive statistics are enough.

Descriptive analysis is also called a ‘univariate analysis’ since it is commonly used to analyze a single variable.

Inferential statistics

Inferential statistics are used to make predictions about a larger population after research and data analysis of a sample representing that population. For example, you can ask around 100 audience members at a movie theater if they like the movie they are watching. Researchers then use inferential statistics on the collected  sample  to reason that about 80-90% of people like the movie. 
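A sketch of that movie-theater example: estimating the population proportion from the sample, with a normal-approximation 95% confidence interval (the interval method is an assumption of this sketch, not stated in the article):

```python
import math

# Illustrative sketch: 80 of 100 sampled viewers liked the movie.
n, liked = 100, 80
p_hat = liked / n                        # sample proportion: 0.8
se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error: 0.04
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se

print(round(low, 3), round(high, 3))  # 0.722 0.878
```

That interval is roughly the "about 80-90%" reasoning made explicit: the sample supports a population estimate in the low-to-high 80s percent range.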

Here are two significant areas of inferential statistics.

  • Estimating parameters: It takes statistics from the sample research data and demonstrates something about the population parameter.
  • Hypothesis test: It’s about sampling research data to answer the survey research questions. For example, researchers might be interested to understand if the new shade of lipstick recently launched is good or not, or if the multivitamin capsules help children to perform better at games.

These are sophisticated analysis methods used to showcase the relationship between different variables instead of describing a single variable. It is often used when researchers want something beyond absolute numbers to understand the relationship between variables.

Here are some of the commonly used methods for data analysis in research.

  • Correlation: When researchers are not conducting experimental or quasi-experimental research but are interested in understanding the relationship between two or more variables, they opt for correlational research methods.
  • Cross-tabulation: Also called contingency tables,  cross-tabulation  is used to analyze the relationship between multiple variables.  Suppose provided data has age and gender categories presented in rows and columns. A two-dimensional cross-tabulation helps for seamless data analysis and research by showing the number of males and females in each age category.
  • Regression analysis: For understanding the strong relationship between two variables, researchers do not look beyond the primary and commonly used regression analysis method, which is also a type of predictive analysis used. In this method, you have an essential factor called the dependent variable. You also have multiple independent variables in regression analysis. You undertake efforts to find out the impact of independent variables on the dependent variable. The values of both independent and dependent variables are assumed as being ascertained in an error-free random manner.
  • Frequency tables: The statistical procedure used to summarize how often each value of a variable occurs in the sample, typically as a first step before applying further tests.
  • Analysis of variance: The statistical procedure is used for testing the degree to which two or more vary or differ in an experiment. A considerable degree of variation means research findings were significant. In many contexts, ANOVA testing and variance analysis are similar.
  • Researchers must have the necessary research skills to analyze and manipulate the data, and be trained to demonstrate a high standard of research practice. Ideally, researchers should possess more than a basic understanding of the rationale for selecting one statistical method over another to obtain better data insights.
  • Usually, research and data analytics projects differ by scientific discipline; therefore, getting statistical advice at the beginning of analysis helps design a survey questionnaire, select data collection methods , and choose samples.

LEARN ABOUT: Best Data Collection Tools

  • The primary aim of research data analysis is to derive insights that are unbiased. Any mistake in collecting data, selecting an analysis method, or choosing an  audience  sample, or approaching any of these with a biased mind, is likely to lead to a biased inference.
  • No amount of sophistication in research data analysis can rectify poorly defined objectives or outcome measurements. Whether the design is at fault or the intentions are unclear, a lack of clarity might mislead readers, so avoid the practice.
  • The motive behind data analysis in research is to present accurate and reliable data. As far as possible, avoid statistical errors, and find a way to deal with everyday challenges like outliers, missing data, data altering, data mining , or developing graphical representation.

LEARN MORE: Descriptive Research vs Correlational Research

The sheer amount of data generated daily is staggering, especially now that data analysis has taken center stage. In 2018 alone, the total data supply amounted to 2.8 trillion gigabytes. Hence, it is clear that enterprises willing to survive in the hypercompetitive world must possess an excellent capability to analyze complex research data, derive actionable insights, and adapt to new market needs.

LEARN ABOUT: Average Order Value

QuestionPro is an online survey platform that empowers organizations in data analysis and research and provides them a medium to collect data by creating appealing surveys.


Free ACCA & CIMA online courses from OpenTuition


Free Notes, Lectures, Tests and Forums for ACCA and CIMA exams


The Research and Analysis Project (RAP)

Research Report Topics

This is a formal 7,500 word report on one of 20 set subjects selected from the following list:

  • An analysis and evaluation of an organisation’s budgetary control system and its links with performance management and decision making
  • An evaluation of how the introduction of a new technology can assist an organisation in achieving its business objectives
  • An assessment of the potential impact of an aspect of impending legislation on the operations and financial position of an organisation
  • Analyse and evaluate the business and financial performance of an organisation which has performed exceptionally poorly over a three-year period with a critical analysis of the reasons for its difficulties.
  • Analyse and evaluate the business and financial performance of an organisation which has performed exceptionally well over a three-year period with a critical analysis of the reasons for its success.
  • A critical review of key factors or indicators in the motivation of employees in an organisation
  • A critical evaluation of the restructuring of an organisation’s operational activities and the effect on the organisation’s financial performance
  • Analyse and evaluate the business and financial performance over a three year period of an organisation operating in a sector that has faced strategic and operational challenges with an emphasis on how management have addressed these challenges.
  • A critical evaluation of the planning and implementation of an information system in an organisation.
  • A review of the effectiveness of the use of costing techniques within an organisation
  • An investigation into the financial and operational costs and benefits of the internal audit / internal review activities within an organisation
  • An investigation into the possible effects of a proposed accounting standard on the financial statements and business activities of an organisation
  • An evaluation of the contribution made by human resource activities to the attainment of business and financial objectives
  • An appraisal of the business and financial objectives of a strategic investment decision made by an organisation and its impact on key stakeholders
  • Analyse and evaluate how an organisation has managed the use of data to inform business performance and achieve strategic advantage. Consideration must be given to how the ethics of data use is managed.
  • A critical evaluation of the financial and operational risk management within an organisation.
  • Select an organisation that has been identified as having weak corporate governance structures within the past 5 years. Critically evaluate their corporate governance practices including an assessment of the origins of the corporate governance issue(s) and the organisation’s response.
  • A review of the marketing strategy of an organisation and its effectiveness.
  • An analysis and evaluation of the financial and operational consequences of a merger between two organisations or of the acquisition of one organisation by another.
  • Select an organisation that has been identified as having weak social responsibility practices within the past 5 years. Critically evaluate their social responsibility practices, including an assessment of the origins of the problem(s), the organisation’s response, and the impact of this on the organisation’s key stakeholders.

*Please consult the latest ACCA Global Oxford Brookes Information Pack if submitting on these topics as specific industry sectors must be used.

For all topics other than 8 and 15, please also note the requirement to base your project on recent organisational activity or processes rather than historical events. (For this purpose, ‘historical’ refers to activity that took place more than three full calendar years before the start of the current submission period.) For example, if you were proposing to submit a project in May 2019 for topic 19, the merger or acquisition would have to have taken place AFTER 1 JANUARY 2016.

In addition to the Research & Analysis report you must also submit a 2,000 word (approx.) Skills and Learning Statement and Presentation slides (see below) at the same time.

Skills and Learning Statement

There are two elements to this.  Your work will not actually be graded for the SLS section, but you will be assessed as either ‘competent’ or ‘non-competent.’

Firstly, there is the actual written Skills and Learning Statement, a 2,000 word statement based on answering 4 set questions (see the Information Pack for these) that revolve around how you responded to the challenges of producing the RAP, what you learned from the whole process from start to finish, and the skills you acquired.  OBU stress that there are no right or wrong answers to these 4 questions, but there are right or wrong approaches.  This assessment criterion is called ‘Self-reflection’ and, as the name implies, requires you to do some self-assessment and self-evaluation which, if inadequate, would result in you being classified as ‘non-competent’.

The second element is the PowerPoint Presentation which is intended to focus on the main findings of your RAP and be delivered to your mentor in 15 minutes.  The assessment criterion for this is ‘Communication Skills’ – essentially this means that the marker will be assessing whether your Presentation has been designed and used to present the information effectively to an audience within the time allotted.

Consequences of getting a Pass in the RAP and failure in the SLS and/or Presentation or vice versa

The three parts of the submission are independent.  This means that if you pass either the RAP, the written SLS or the Presentation but fail one or more of the other parts then the part passed does not have to be resubmitted and the pass will be carried forward to your next resubmission.  What this in effect means is that if you failed any part of the SLS section (even both parts) but were awarded a pass grade of C or above for your RAP, that actual grade will stand once you have passed the SLS.  However any failure in the RAP part (whether in technical and professional skills or graduate skills) means that you will have to resubmit the whole RAP and you cannot be awarded any grade higher than a C.

Resubmission

Although failing the RAP or SLS (either the written part or the presentation) at the first attempt will obviously be a disappointment, it really is not ‘the end of the world’: as mentioned above, you will have two further attempts to make good any deficiencies.  Although pass rates are currently around 55% (across all submissions), most students do go on to achieve a pass at their second or third attempt.  You are not obliged to resubmit at the next submission period (although most students do); you can postpone your resubmission(s) until you feel ready or until it is convenient for you, provided that you have not exceeded the 10-year time limit.

More about OBU:

  • What is OBU and eligibility
  • The Research & Analysis Project (RAP)
  • The role of the project mentor and how to choose a mentor
  • The OBU RAP Information Pack
  • RAP submission and assessment criteria
  • The Resubmission Statement – how a good one is your key to success on resubmission
  • The Academic Conduct Office (ACO)
  • How OpenTuition can help you get your OBU BSc (Hons) degree

Visit also the OBU forums for expert advice from contributors who have years of experience in helping students prepare for their RAP submission, in some cases contributors who themselves gained their degree by going through the OBU RAP process.



Data Analysis in Research: Types & Methods

Data analysis is a crucial step in the research process, transforming raw data into meaningful insights that drive informed decisions and advance knowledge. This article explores the various types and methods of data analysis in research, providing a comprehensive guide for researchers across disciplines.


Overview of Data Analysis in Research

Data analysis in research is the systematic use of statistical and analytical tools to describe, summarize, and draw conclusions from datasets. This process involves organizing, analyzing, modeling, and transforming data to identify trends, establish connections, and inform decision-making. The main goals include describing data through visualization and statistics, making inferences about a broader population, predicting future events using historical data, and providing data-driven recommendations. The stages of data analysis involve collecting relevant data, preprocessing to clean and format it, conducting exploratory data analysis to identify patterns, building and testing models, interpreting results, and effectively reporting findings.

  • Main Goals : Describe data, make inferences, predict future events, and provide data-driven recommendations.
  • Stages of Data Analysis : Data collection, preprocessing, exploratory data analysis, model building and testing, interpretation, and reporting.
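
The stages above can be sketched as a small pipeline. This is a minimal illustration using only Python's standard library; the data and function names are hypothetical.

```python
# Sketch of the data-analysis stages as a pipeline (hypothetical data).
from statistics import mean

def collect():
    # Stage 1: data collection (here, a hard-coded sample with a gap)
    return [12, 15, None, 14, 16, 15]

def preprocess(raw):
    # Stage 2: preprocessing -- drop missing values
    return [x for x in raw if x is not None]

def explore(data):
    # Stage 3: exploratory analysis -- a simple summary
    return {"n": len(data), "mean": mean(data), "min": min(data), "max": max(data)}

def report(summary):
    # Final stage: reporting -- format the findings for the reader
    return f"n={summary['n']}, mean={summary['mean']:.1f}"

summary = explore(preprocess(collect()))
print(report(summary))  # n=5, mean=14.4
```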

Types of Data Analysis

1. Descriptive Analysis

Descriptive analysis focuses on summarizing and describing the features of a dataset. It provides a snapshot of the data, highlighting central tendencies, dispersion, and overall patterns.

  • Central Tendency Measures : Mean, median, and mode are used to identify the central point of the dataset.
  • Dispersion Measures : Range, variance, and standard deviation help in understanding the spread of the data.
  • Frequency Distribution : This shows how often each value in a dataset occurs.
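
As a small illustration, all of these descriptive measures are available in Python's standard library (the sample values here are invented):

```python
# Descriptive statistics for a small sample, standard library only.
from collections import Counter
from statistics import mean, median, mode, variance, stdev

data = [2, 4, 4, 4, 5, 5, 7, 9]

print("mean:", mean(data))              # 5 (central tendency)
print("median:", median(data))          # 4.5
print("mode:", mode(data))              # 4
print("range:", max(data) - min(data))  # 7 (dispersion)
print("sample variance:", variance(data))
print("std dev:", round(stdev(data), 3))

# Frequency distribution: how often each value occurs
print("frequencies:", Counter(data))
```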

2. Inferential Analysis

Inferential analysis allows researchers to make predictions or inferences about a population based on a sample of data. It is used to test hypotheses and determine the relationships between variables.

  • Hypothesis Testing : Techniques like t-tests, chi-square tests, and ANOVA are used to test assumptions about a population.
  • Regression Analysis : This method examines the relationship between dependent and independent variables.
  • Confidence Intervals : These provide a range of values within which the true population parameter is expected to lie.
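
A sketch of these ideas from first principles, using only the standard library: a Welch two-sample t statistic and an approximate 95% confidence interval. In practice a library such as SciPy provides these tests together with p-values; the 1.96 critical value below is the normal approximation, and the two samples are invented.

```python
# Two-sample (Welch) t statistic and an approximate 95% CI for a mean.
from math import sqrt
from statistics import mean, stdev

def welch_t(sample_a, sample_b):
    """t statistic for the difference in means of two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    return (mean(sample_a) - mean(sample_b)) / sqrt(va / na + vb / nb)

def ci_95(sample):
    """Approximate 95% CI for the sample mean (normal approximation)."""
    m = mean(sample)
    se = stdev(sample) / sqrt(len(sample))
    return (m - 1.96 * se, m + 1.96 * se)

control = [71, 74, 69, 70, 73, 72, 68, 70]
treated = [75, 78, 74, 77, 76, 79, 73, 76]

print("t =", round(welch_t(treated, control), 2))
print("95% CI for treated mean:", tuple(round(x, 1) for x in ci_95(treated)))
```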

3. Exploratory Data Analysis (EDA)

EDA is an approach to analyzing data sets to summarize their main characteristics, often with visual methods. It helps in discovering patterns, spotting anomalies, and checking assumptions with the help of graphical representations.

  • Visual Techniques : Histograms, box plots, scatter plots, and bar charts are commonly used in EDA.
  • Summary Statistics : Basic statistical measures are used to describe the dataset.
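
A minimal EDA pass can be done with the standard library alone: a quartile summary plus a text histogram. (In practice the plots mentioned above would be drawn with a library such as matplotlib; the sample values here are invented.)

```python
# Quick exploratory look at a sample: quartile summary and a text histogram.
from collections import Counter
from statistics import quantiles, mean

data = [3, 7, 8, 5, 12, 14, 21, 13, 18, 5, 9, 11, 7, 16, 4, 10]

q1, q2, q3 = quantiles(data, n=4)  # quartiles
print(f"min={min(data)} Q1={q1} median={q2} Q3={q3} max={max(data)}")
print(f"mean={mean(data):.2f}")

# Text histogram: bucket values into bins of width 5
bins = Counter((x // 5) * 5 for x in data)
for lo in sorted(bins):
    print(f"{lo:2d}-{lo + 4:2d} | {'#' * bins[lo]}")
```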

4. Predictive Analysis

Predictive analysis uses statistical techniques and machine learning algorithms to predict future outcomes based on historical data.

  • Machine Learning Models : Algorithms like linear regression, decision trees, and neural networks are employed to make predictions.
  • Time Series Analysis : This method analyzes data points collected or recorded at specific time intervals to forecast future trends.
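
As an illustration of the prediction step, here is ordinary least squares for a single predictor written from first principles (scikit-learn's LinearRegression is the usual choice; the spend/sales figures are invented and lie exactly on a line so the fit is easy to verify):

```python
# Ordinary least squares for one predictor, from first principles.
from statistics import mean

def fit_line(xs, ys):
    """Return (slope, intercept) minimising squared error."""
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical history: advertising spend vs. sales
spend = [1, 2, 3, 4, 5]
sales = [3, 5, 7, 9, 11]           # exactly sales = 2 * spend + 1

slope, intercept = fit_line(spend, sales)
predicted = slope * 6 + intercept  # forecast for an unseen spend level
print(slope, intercept, predicted)  # 2.0 1.0 13.0
```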

5. Causal Analysis

Causal analysis aims to identify cause-and-effect relationships between variables. It helps in understanding the impact of one variable on another.

  • Experiments : Controlled experiments are designed to test the causality.
  • Quasi-Experimental Designs : These are used when controlled experiments are not feasible.

6. Mechanistic Analysis

Mechanistic analysis seeks to understand the underlying mechanisms or processes that drive observed phenomena. It is common in fields like biology and engineering.

Methods of Data Analysis

1. Quantitative Methods

Quantitative methods involve numerical data and statistical analysis to uncover patterns, relationships, and trends.

  • Statistical Analysis : Includes various statistical tests and measures.
  • Mathematical Modeling : Uses mathematical equations to represent relationships among variables.
  • Simulation : Computer-based models simulate real-world processes to predict outcomes.
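
Simulation in miniature: a Monte Carlo estimate of the probability that two dice sum to more than 8, compared with the exact value (10/36). The scenario is invented purely to show the technique.

```python
# Monte Carlo simulation with a fixed seed for reproducibility.
import random

def simulate(trials, seed=42):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if rng.randint(1, 6) + rng.randint(1, 6) > 8)
    return hits / trials

estimate = simulate(100_000)
exact = 10 / 36  # outcomes 9..12 out of 36 equally likely sums
print(f"estimate={estimate:.4f}, exact={exact:.4f}")
```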

2. Qualitative Methods

Qualitative methods focus on non-numerical data, such as text, images, and audio, to understand concepts, opinions, or experiences.

  • Content Analysis : Systematic coding and categorizing of textual information.
  • Thematic Analysis : Identifying themes and patterns within qualitative data.
  • Narrative Analysis : Examining the stories or accounts shared by participants.

3. Mixed Methods

Mixed methods combine both quantitative and qualitative approaches to provide a more comprehensive analysis.

  • Sequential Explanatory Design : Quantitative data is collected and analyzed first, followed by qualitative data to explain the quantitative results.
  • Concurrent Triangulation Design : Both qualitative and quantitative data are collected simultaneously but analyzed separately to compare results.

4. Data Mining

Data mining involves exploring large datasets to discover patterns and relationships.

  • Clustering : Grouping data points with similar characteristics.
  • Association Rule Learning : Identifying interesting relations between variables in large databases.
  • Classification : Assigning items to predefined categories based on their attributes.
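
A toy version of the clustering step: k-means on one-dimensional data, written from scratch (scikit-learn's KMeans is the usual tool; the points and starting centroids are invented):

```python
# Miniature k-means clustering on 1-D data.
def kmeans_1d(points, centroids, iterations=10):
    for _ in range(iterations):
        # Assignment step: each point joins its nearest centroid
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to its cluster mean
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

points = [1.0, 1.5, 2.0, 9.0, 10.0, 11.0]
centroids, clusters = kmeans_1d(points, centroids=[1.0, 9.0])
print(centroids)  # [1.5, 10.0]
print(clusters)   # [[1.0, 1.5, 2.0], [9.0, 10.0, 11.0]]
```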

5. Big Data Analytics

Big data analytics involves analyzing vast amounts of data to uncover hidden patterns, correlations, and other insights.

  • Hadoop and Spark : Frameworks for processing and analyzing large datasets.
  • NoSQL Databases : Designed to handle unstructured data.
  • Machine Learning Algorithms : Used to analyze and predict complex patterns in big data.
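
The MapReduce pattern that frameworks such as Hadoop and Spark distribute across a cluster can be shown in miniature on one machine: map each record to (key, value) pairs, shuffle by key, then reduce each group. The log lines here are invented.

```python
# MapReduce in miniature: count log records by level.
from collections import defaultdict

records = ["error disk full", "info started", "error timeout", "info ok"]

# Map: emit (log level, 1) for each record
mapped = [(line.split()[0], 1) for line in records]

# Shuffle: group values by key
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: sum each group
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # {'error': 2, 'info': 2}
```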

Applications and Case Studies

Numerous fields and industries use data analysis methods, which provide insightful information and facilitate data-driven decision-making. The following case studies demonstrate the effectiveness of data analysis in research:

Medical Care:

  • Predicting Patient Readmissions: By using data analysis to create predictive models, healthcare facilities can better identify patients at high risk of readmission and implement focused interventions to improve patient care.
  • Disease Outbreak Analysis: Researchers can monitor and forecast disease outbreaks by examining both historical and current data, aiding public health authorities in putting preventative and control measures in place.

Finance:

  • Fraud Detection: To safeguard clients and reduce financial losses, financial institutions use data analysis tools to identify fraudulent transactions and activities.
  • Investing Strategies: Data analysis underpins quantitative investing models that detect trends in stock prices, helping investors optimise their portfolios and make well-informed choices.

Marketing:

  • Customer Segmentation: Businesses can divide their client base into distinct groups using data analysis, enabling focused marketing campaigns and individualised services.
  • Social Media Analytics: By tracking brand sentiment, identifying influencers, and understanding consumer preferences from social media data, marketers can develop more successful marketing strategies.

Education:

  • Predicting Student Performance: Data analysis tools help educators identify at-risk students and forecast their performance, allowing individualised learning plans and timely interventions.
  • Education Policy Analysis: Researchers can use data to assess the efficacy of education policies, initiatives, and programs, offering insights for evidence-based decision-making.

Social Science Fields:

  • Opinion Mining in Politics: By examining public-opinion data from news stories and social media platforms, academics and policymakers can gauge prevailing political opinions and better understand how the public feels about particular topics or candidates.
  • Crime Analysis: By studying crime data, researchers can spot trends, anticipate high-risk locations, and help law enforcement allocate resources wisely in order to deter and reduce crime.

Data analysis is a crucial step in the research process because it enables companies and researchers to glean insightful information from data. By using diverse analytical methodologies and approaches, scholars may reveal latent patterns, arrive at well-informed conclusions, and tackle intricate research inquiries. Numerous statistical, machine learning, and visualization approaches are among the many data analysis tools available, offering a comprehensive toolbox for addressing a broad variety of research problems.

Data Analysis in Research FAQs:

What are the main phases in the process of analyzing data?

In general, the steps involved in data analysis include gathering data, preparing it, doing exploratory data analysis, constructing and testing models, interpreting the results, and reporting the findings. Every stage is essential to ensuring the analysis is accurate and effective.

What are the differences between the examination of qualitative and quantitative data?

To comprehend and analyze non-numerical data, such as text, pictures, or observations, qualitative data analysis often employs content analysis, grounded theory, or ethnography. Quantitative data analysis, by comparison, works with numerical data and uses statistical methods to describe, infer, and forecast trends in the data.

What are a few popular statistical methods for analyzing data?

Descriptive statistics, inferential statistics, and predictive modeling are often used in data analysis. Descriptive statistics summarise the fundamental characteristics of the data, while inferential statistics test assumptions and draw inferences about a wider population. Predictive modeling is used to estimate unknown values or forecast future events.

In what ways might data analysis methods be used in the healthcare industry?

In the healthcare industry, data analysis may be used to optimize treatment regimens, monitor disease outbreaks, forecast patient readmissions, and enhance patient care. It is also essential for medication development, clinical research, and the creation of healthcare policies.

What difficulties may one encounter while analyzing data?

Typical data-quality problems include missing values, outliers, and biased samples, all of which may affect the accuracy of the analysis. Furthermore, analyzing large and complicated datasets can be computationally demanding, necessitating specialised tools and expertise. It is also critical to handle ethical issues, such as data security and privacy.


  • Neurol Res Pract


How to use and assess qualitative research methods

Loraine Busetto

1 Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany

Wolfgang Wick

2 Clinical Cooperation Unit Neuro-Oncology, German Cancer Research Center, Heidelberg, Germany

Christoph Gumbinger

Associated data.

Not applicable.

This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions, and focussing on intervention improvement. The most common methods of data collection are document study, (non-) participant observations, semi-structured interviews and focus groups. For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research conducted. Using qualitative in addition to quantitative designs will equip us with better tools to address a greater range of research problems, and to fill in blind spots in current neurological research and practice.

The aim of this paper is to provide an overview of qualitative research methods, including hands-on information on how they can be used, reported and assessed. This article is intended for beginning qualitative researchers in the health sciences as well as experienced quantitative researchers who wish to broaden their understanding of qualitative research.

What is qualitative research?

Qualitative research is defined as “the study of the nature of phenomena”, including “their quality, different manifestations, the context in which they appear or the perspectives from which they can be perceived” , but excluding “their range, frequency and place in an objectively determined chain of cause and effect” [ 1 ]. This formal definition can be complemented with a more pragmatic rule of thumb: qualitative research generally includes data in form of words rather than numbers [ 2 ].

Why conduct qualitative research?

Because some research questions cannot be answered using (only) quantitative methods. For example, one Australian study addressed the issue of why patients from Aboriginal communities often present late or not at all to specialist services offered by tertiary care hospitals. Using qualitative interviews with patients and staff, it found one of the most significant access barriers to be transportation problems, including some towns and communities simply not having a bus service to the hospital [ 3 ]. A quantitative study could have measured the number of patients over time or even looked at possible explanatory factors – but only those previously known or suspected to be of relevance. To discover reasons for observed patterns, especially the invisible or surprising ones, qualitative designs are needed.

While qualitative research is common in other fields, it is still relatively underrepresented in health services research. The latter field is more traditionally rooted in the evidence-based-medicine paradigm, as seen in " research that involves testing the effectiveness of various strategies to achieve changes in clinical practice, preferably applying randomised controlled trial study designs (...) " [ 4 ]. This focus on quantitative research and specifically randomised controlled trials (RCT) is visible in the idea of a hierarchy of research evidence which assumes that some research designs are objectively better than others, and that choosing a "lesser" design is only acceptable when the better ones are not practically or ethically feasible [ 5 , 6 ]. Others, however, argue that an objective hierarchy does not exist, and that, instead, the research design and methods should be chosen to fit the specific research question at hand – "questions before methods" [ 2 , 7 – 9 ]. This means that even when an RCT is possible, some research problems require a different design that is better suited to addressing them. Arguing in JAMA, Berwick uses the example of rapid response teams in hospitals, which he describes as " a complex, multicomponent intervention – essentially a process of social change" susceptible to a range of different context factors including leadership or organisation history. According to him, "[in] such complex terrain, the RCT is an impoverished way to learn. Critics who use it as a truth standard in this context are incorrect" [ 8 ] . Instead of limiting oneself to RCTs, Berwick recommends embracing a wider range of methods , including qualitative ones, which for "these specific applications, (...) are not compromises in learning how to improve; they are superior" [ 8 ].

Research problems that can be approached particularly well using qualitative methods include assessing complex multi-component interventions or systems (of change), addressing questions beyond “what works”, towards “what works for whom when, how and why”, and focussing on intervention improvement rather than accreditation [ 7 , 9 – 12 ]. Using qualitative methods can also help shed light on the “softer” side of medical treatment. For example, while quantitative trials can measure the costs and benefits of neuro-oncological treatment in terms of survival rates or adverse effects, qualitative research can help provide a better understanding of patient or caregiver stress, visibility of illness or out-of-pocket expenses.

How to conduct qualitative research?

Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [ 13 , 14 ]. As Fossey puts it : “sampling, data collection, analysis and interpretation are related to each other in a cyclical (iterative) manner, rather than following one after another in a stepwise approach” [ 15 ]. The researcher can make educated decisions with regard to the choice of method, how they are implemented, and to which and how many units they are applied [ 13 ]. As shown in Fig.  1 , this can involve several back-and-forth steps between data collection and analysis where new insights and experiences can lead to adaption and expansion of the original plan. Some insights may also necessitate a revision of the research question and/or the research design as a whole. The process ends when saturation is achieved, i.e. when no relevant new information can be found (see also below: sampling and saturation). For reasons of transparency, it is essential for all decisions as well as the underlying reasoning to be well-documented.

Fig. 1: Iterative research process

While it is not always explicitly addressed, qualitative methods reflect a different underlying research paradigm than quantitative research (e.g. constructivism or interpretivism as opposed to positivism). The choice of methods can be based on the respective underlying substantive theory or theoretical framework used by the researcher [ 2 ].

Data collection

The methods of qualitative data collection most commonly used in health research are document study, observations, semi-structured interviews and focus groups [ 1 , 14 , 16 , 17 ].

Document study

Document study (also called document analysis) refers to the review by the researcher of written materials [ 14 ]. These can include personal and non-personal documents such as archives, annual reports, guidelines, policy documents, diaries or letters.

Observations

Observations are particularly useful to gain insights into a certain setting and actual behaviour – as opposed to reported behaviour or opinions [ 13 ]. Qualitative observations can be either participant or non-participant in nature. In participant observations, the observer is part of the observed setting, for example a nurse working in an intensive care unit [ 18 ]. In non-participant observations, the observer is “on the outside looking in”, i.e. present in but not part of the situation, trying not to influence the setting by their presence. Observations can be planned (e.g. for 3 h during the day or night shift) or ad hoc (e.g. as soon as a stroke patient arrives at the emergency room). During the observation, the observer takes notes on everything or certain pre-determined parts of what is happening around them, for example focusing on physician-patient interactions or communication between different professional groups. Written notes can be taken during or after the observations, depending on feasibility (which is usually lower during participant observations) and acceptability (e.g. when the observer is perceived to be judging the observed). Afterwards, these field notes are transcribed into observation protocols. If more than one observer was involved, field notes are taken independently, but notes can be consolidated into one protocol after discussions. Advantages of conducting observations include minimising the distance between the researcher and the researched, the potential discovery of topics that the researcher did not realise were relevant and gaining deeper insights into the real-world dimensions of the research problem at hand [ 18 ].

Semi-structured interviews

Hijmans & Kuyper describe qualitative interviews as “an exchange with an informal character, a conversation with a goal” [ 19 ]. Interviews are used to gain insights into a person’s subjective experiences, opinions and motivations – as opposed to facts or behaviours [ 13 ]. Interviews can be distinguished by the degree to which they are structured (i.e. a questionnaire), open (e.g. free conversation or autobiographical interviews) or semi-structured [ 2 , 13 ]. Semi-structured interviews are characterized by open-ended questions and the use of an interview guide (or topic guide/list) in which the broad areas of interest, sometimes including sub-questions, are defined [ 19 ]. The pre-defined topics in the interview guide can be derived from the literature, previous research or a preliminary method of data collection, e.g. document study or observations. The topic list is usually adapted and improved at the start of the data collection process as the interviewer learns more about the field [ 20 ]. Across interviews the focus on the different (blocks of) questions may differ and some questions may be skipped altogether (e.g. if the interviewee is not able or willing to answer the questions or for concerns about the total length of the interview) [ 20 ]. Qualitative interviews are usually not conducted in written format as it impedes on the interactive component of the method [ 20 ]. In comparison to written surveys, qualitative interviews have the advantage of being interactive and allowing for unexpected topics to emerge and to be taken up by the researcher. This can also help overcome a provider or researcher-centred bias often found in written surveys, which by nature, can only measure what is already known or expected to be of relevance to the researcher. Interviews can be audio- or video-taped; but sometimes it is only feasible or acceptable for the interviewer to take written notes [ 14 , 16 , 20 ].

Focus groups

Focus groups are group interviews to explore participants’ expertise and experiences, including explorations of how and why people behave in certain ways [ 1 ]. Focus groups usually consist of 6–8 people and are led by an experienced moderator following a topic guide or “script” [ 21 ]. They can involve an observer who takes note of the non-verbal aspects of the situation, possibly using an observation guide [ 21 ]. Depending on researchers’ and participants’ preferences, the discussions can be audio- or video-taped and transcribed afterwards [ 21 ]. Focus groups are useful for bringing together homogeneous (to a lesser extent heterogeneous) groups of participants with relevant expertise and experience on a given topic on which they can share detailed information [ 21 ]. Focus groups are a relatively easy, fast and inexpensive method to gain access to information on interactions in a given group, i.e. “the sharing and comparing” among participants [ 21 ]. Disadvantages include less control over the process and a lesser extent to which each individual may participate. Moreover, focus group moderators need experience, as do those tasked with the analysis of the resulting data. Focus groups can be less appropriate for discussing sensitive topics that participants might be reluctant to disclose in a group setting [ 13 ]. Moreover, attention must be paid to the emergence of “groupthink” as well as possible power dynamics within the group, e.g. when patients are awed or intimidated by health professionals.

Choosing the “right” method

As explained above, the school of thought underlying qualitative research assumes no objective hierarchy of evidence and methods. This means that each choice of single or combined methods has to be based on the research question that needs to be answered and a critical assessment with regard to whether or to what extent the chosen method can accomplish this – i.e. the “fit” between question and method [ 14 ]. It is necessary for these decisions to be documented when they are being made, and to be critically discussed when reporting methods and results.

Let us assume that our research aim is to examine the (clinical) processes around acute endovascular treatment (EVT), from the patient’s arrival at the emergency room to recanalization, with the aim to identify possible causes for delay and/or other causes for sub-optimal treatment outcome. As a first step, we could conduct a document study of the relevant standard operating procedures (SOPs) for this phase of care – are they up-to-date and in line with current guidelines? Do they contain any mistakes, irregularities or uncertainties that could cause delays or other problems? Regardless of the answers to these questions, the results have to be interpreted based on what they are: a written outline of what care processes in this hospital should look like. If we want to know what they actually look like in practice, we can conduct observations of the processes described in the SOPs. These results can (and should) be analysed in themselves, but also in comparison to the results of the document analysis, especially as regards relevant discrepancies. Do the SOPs outline specific tests for which no equipment can be observed or tasks to be performed by specialized nurses who are not present during the observation? It might also be possible that the written SOP is outdated, but the actual care provided is in line with current best practice. In order to find out why these discrepancies exist, it can be useful to conduct interviews. Are the physicians simply not aware of the SOPs (because their existence is limited to the hospital’s intranet) or do they actively disagree with them or does the infrastructure make it impossible to provide the care as described? Another rationale for adding interviews is that some situations (or all of their possible variations for different patient groups or the day, night or weekend shift) cannot practically or ethically be observed. 
In this case, it is possible to ask those involved to report on their actions – being aware that this is not the same as the actual observation. A senior physician’s or hospital manager’s description of certain situations might differ from a nurse’s or junior physician’s one, maybe because they intentionally misrepresent facts or maybe because different aspects of the process are visible or important to them. In some cases, it can also be relevant to consider to whom the interviewee is disclosing this information – someone they trust, someone they are otherwise not connected to, or someone they suspect or are aware of being in a potentially “dangerous” power relationship to them. Lastly, a focus group could be conducted with representatives of the relevant professional groups to explore how and why exactly they provide care around EVT. The discussion might reveal discrepancies (between SOPs and actual care or between different physicians) and motivations to the researchers as well as to the focus group members that they might not have been aware of themselves. For the focus group to deliver relevant information, attention has to be paid to its composition and conduct, for example, to make sure that all participants feel safe to disclose sensitive or potentially problematic information or that the discussion is not dominated by (senior) physicians only. The resulting combination of data collection methods is shown in Fig.  2 .

Fig. 2: Possible combination of data collection methods

Attributions for icons: “Book” by Serhii Smirnov, “Interview” by Adrien Coquet, FR, “Magnifying Glass” by anggun, ID, “Business communication” by Vectors Market; all from the Noun Project

The combination of multiple data sources as described for this example can be referred to as “triangulation”, in which multiple measurements are carried out from different angles to achieve a more comprehensive understanding of the phenomenon under study [ 22 , 23 ].

Data analysis

To analyse the data collected through observations, interviews and focus groups, these need to be transcribed into protocols and transcripts (see Fig.  3 ). Interviews and focus groups can be transcribed verbatim, with or without annotations for behaviour (e.g. laughing, crying, pausing) and with or without phonetic transcription of dialects and filler words, depending on what is expected or known to be relevant for the analysis. In the next step, the protocols and transcripts are coded, that is, marked (or tagged, labelled) with one or more short descriptors of the content of a sentence or paragraph [ 2 , 15 , 23 ]. Jansen describes coding as “connecting the raw data with “theoretical” terms” [ 20 ]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation from multiple data sources (e.g. SOPs, emergency room observations, staff and patient interviews). In a process of synthesis and abstraction, the codes are then grouped, summarised and/or categorised [ 15 , 20 ]. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [ 20 ]. The coding process is performed using qualitative data management software, the most common ones being NVivo, MAXQDA and Atlas.ti. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [ 14 ].
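
What coding makes possible can be illustrated with a few lines of Python: segments tagged with codes become sortable and retrievable across data sources. The segments, codes and sources below are invented for illustration; real projects use software such as NVivo, MAXQDA or Atlas.ti for this.

```python
# Illustrative coded transcript segments (all content hypothetical).
segments = [
    {"source": "staff interview 1", "code": "tele-neurology",
     "text": "We usually start the video consult within ten minutes."},
    {"source": "ER observation",    "code": "delay",
     "text": "CT scanner occupied; team waits in the hallway."},
    {"source": "patient interview", "code": "tele-neurology",
     "text": "The doctor on the screen explained everything."},
]

def extract(code):
    """Return all segments tagged with a given code, across sources."""
    return [s for s in segments if s["code"] == code]

for s in extract("tele-neurology"):
    print(f'[{s["source"]}] {s["text"]}')
```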

Fig. 3: From data collection to data analysis

Attributions for icons: see Fig. 2; also "Speech to text" by Trevor Dsouza, "Field Notes" by Mike O’Brien, US, "Voice Record" by ProSymbols, US, "Inspection" by Made, AU, and "Cloud" by Graphic Tigers; all from the Noun Project

How to report qualitative research?

Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as in RCT protocols, i.e. to pre-define and set in stone the research questions and primary or secondary endpoints. Rather, it is a way to describe the research methods in detail, which might not be possible in the results paper given journals’ word limits. Qualitative research papers are usually longer than their quantitative counterparts to allow for deep understanding and so-called “thick description”. In the methods section, the focus is on transparency of the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation. The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [ 20 ]. Here it is important to support main findings by relevant quotations, which may add information, context, emphasis or real-life examples [ 20 , 23 ]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement (e.g. “Five interviewees expressed negative feelings towards XYZ”) [ 21 ].

How to combine qualitative with quantitative research?

Qualitative methods can be combined with other methods in multi- or mixed methods designs, which “[employ] two or more different methods [ …] within the same study or research program rather than confining the research to one single method” [ 24 ]. Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of (unexpected) results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [ 1 , 17 , 24 – 26 ]. The resulting designs can be classified according to when, why and how the different quantitative and/or qualitative data strands are combined. The three most common types of mixed method designs are the convergent parallel design , the explanatory sequential design and the exploratory sequential design. The designs with examples are shown in Fig.  4 .

Fig. 4: Three common mixed methods designs

In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation of results. Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing results. Amongst other things, this would make it possible to assess whether interview respondents’ subjective impressions of patients receiving good care match modified Rankin Scores at follow-up, or whether observed delays in care provision are exceptions or the rule when compared to door-to-needle times as documented in the registry. In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the results from the quantitative study. This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times and the qualitative study were used to understand where and why these occurred, and how they could be improved. In the exploratory sequential design, the qualitative study is carried out first and its results help inform and build the quantitative study in the next step [26]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative findings on which topics dissatisfaction had been expressed. Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different (types of) hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.

How to assess qualitative research?

A variety of assessment criteria and lists have been developed for qualitative research, varying in focus and comprehensiveness [14, 17, 27]. However, none of these has been established as the "gold standard" in the field. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.

Assessors should check the authors’ use of and adherence to the relevant reporting checklists (e.g. Standards for Reporting Qualitative Research (SRQR)) to make sure all items that are relevant for this type of research are addressed [ 23 , 28 ]. Discussions of quantitative measures in addition to or instead of these qualitative measures can be a sign of lower quality of the research (paper). Providing and adhering to a checklist for qualitative research contributes to an important quality criterion for qualitative research, namely transparency [ 15 , 17 , 23 ].

Reflexivity

While methodological transparency and complete reporting is relevant for all types of research, some additional criteria must be taken into account for qualitative research. This includes what is called reflexivity, i.e. sensitivity to the relationship between the researcher and the researched, including how contact was established and maintained, or the background and experience of the researcher(s) involved in data collection and analysis. Depending on the research question and population to be researched this can be limited to professional experience, but it may also include gender, age or ethnicity [ 17 , 27 ]. These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [ 23 ]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and therefore the reader must be made aware of these details [ 19 ].

Sampling and saturation

The aim of qualitative sampling is for all variants of the objects of observation that are deemed relevant for the study to be present in the sample, "to see the issue and its meanings from as many angles as possible" [1, 16, 19, 20, 27], and to ensure "information-richness" [15]. An iterative sampling approach is advised, in which data collection (e.g. five interviews) is followed by data analysis, followed by more data collection to find variants that are lacking in the current sample. This process continues until no new (relevant) information can be found and further sampling becomes redundant – which is called saturation [1, 15]. In other words: qualitative data collection finds its end point not a priori, but when the research team determines that saturation has been reached [29, 30].
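The iterative collect-analyse-collect cycle can be caricatured in a few lines of code. The batches and codes below are invented, and a real saturation judgement is a qualitative decision by the research team, not a simple set comparison:

```python
# Invented example: codes identified in successive batches of interviews.
batches = [
    {"delays", "staffing", "handover"},   # interviews 1-5
    {"delays", "equipment"},              # interviews 6-10
    {"staffing", "delays"},               # interviews 11-15: nothing new
]

def rounds_until_saturation(batches):
    """Count collection rounds until a batch adds no new codes."""
    seen = set()
    for rounds, batch in enumerate(batches, start=1):
        if not batch - seen:        # no new (relevant) information
            return rounds, seen     # saturation reached
        seen |= batch
    return len(batches), seen       # data exhausted before saturating

rounds, codes = rounds_until_saturation(batches)
```

In this toy run the third batch contributes nothing new, so sampling would stop after three rounds with four codes identified.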

This is also the reason why most qualitative studies use deliberate instead of random sampling strategies. This is generally referred to as “ purposive sampling” , in which researchers pre-define which types of participants or cases they need to include so as to cover all variations that are expected to be of relevance, based on the literature, previous experience or theory (i.e. theoretical sampling) [ 14 , 20 ]. Other types of purposive sampling include (but are not limited to) maximum variation sampling, critical case sampling or extreme or deviant case sampling [ 2 ]. In the above EVT example, a purposive sample could include all relevant professional groups and/or all relevant stakeholders (patients, relatives) and/or all relevant times of observation (day, night and weekend shift).

Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [ 14 ].

Good qualitative research is iterative in nature, i.e. it goes back and forth between data collection and analysis, revising and improving the approach where necessary. One example of this are pilot interviews, where different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether the interview can be audio-recorded) are tested with a small number of respondents, evaluated and revised [ 19 ]. In doing so, the interviewer learns which wording or types of questions work best, or which is the best length of an interview with patients who have trouble concentrating for an extended time. Of course, the same reasoning applies to observations or focus groups which can also be piloted.

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process when a common approach must be defined, including the establishment of a useful coding list (or tree), and when a common meaning of individual codes must be established [ 23 ]. An initial sub-set or all transcripts can be coded independently by the coders and then compared and consolidated after regular discussions in the research team. This is to make sure that codes are applied consistently to the research data.

Member checking

Member checking, also called respondent validation , refers to the practice of checking back with study respondents to see if the research is in line with their views [ 14 , 27 ]. This can happen after data collection or analysis or when first results are available [ 23 ]. For example, interviewees can be provided with (summaries of) their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [ 17 ]. Respondents’ feedback on these issues then becomes part of the data collection and analysis [ 27 ].

Stakeholder involvement

In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants (i.e. “members”, see above) but as consultants to and active participants in the broader research process [ 31 – 33 ]. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and (future) patients alike [ 34 , 35 ]. Using the example of patients on or nearing dialysis, a recent scoping review found that 80% of clinical research did not address the top 10 research priorities identified by patients and caregivers [ 32 , 36 ]. In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

How not to assess qualitative research

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of the limited usefulness of these endeavours.

Protocol adherence

Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies – in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.

Sample size

For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [ 1 , 14 , 27 , 37 – 39 ]. Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why.

Randomisation

While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [13, 27]. Relevant disadvantages include the negative impact of an overly large sample size as well as the possibility (or probability) of selecting "quiet, uncooperative or inarticulate individuals" [17]. Qualitative studies do not use control groups, either.

Interrater reliability, variability and other “objectivity checks”

The concept of "interrater reliability" is sometimes used in qualitative research to assess the extent to which the coding approach overlaps between two co-coders. However, it is not clear what this measure tells us about the quality of the analysis [23]. This means that these scores can be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but it is not a requirement. Relatedly, it is not relevant for the quality or "objectivity" of qualitative research to separate those who recruited the study participants from those who collected and analysed the data. Experience even shows that it might be better to have the same person or team perform all of these tasks [20]. First, when researchers introduce themselves during recruitment, this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio-recording is transcribed for analysis, the researcher conducting the interviews will usually remember the interviewee and the specific interview situation during data analysis. This might be helpful in providing additional context information for interpretation of data, e.g. on whether something might have been meant as a joke [18].
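When such a score is reported, it is typically Cohen's kappa, which corrects the raw agreement between two coders for agreement expected by chance. A minimal sketch, assuming each segment receives exactly one code per coder (the code labels are invented):

```python
def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders assigning one code per segment."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # Observed proportion of segments on which the two coders agree.
    p_observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Agreement expected by chance, from each coder's code frequencies.
    labels = set(codes_a) | set(codes_b)
    p_expected = sum(
        (codes_a.count(lab) / n) * (codes_b.count(lab) / n) for lab in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)

coder_a = ["delay", "delay", "staffing", "staffing"]
coder_b = ["delay", "staffing", "staffing", "staffing"]
kappa = cohens_kappa(coder_a, coder_b)  # 0.5: agreement beyond chance, but moderate
```

Even here, the number says nothing about whether the shared coding scheme itself captures the data well, which is the point made above.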

Not being quantitative research

Being qualitative research instead of quantitative research should not be used as an assessment criterion if it is used irrespectively of the research problem at hand. Similarly, qualitative research should not be required to be combined with quantitative research per se – unless mixed methods research is judged as inherently better than single-method research. In this case, the same criterion should be applied for quantitative studies without a qualitative component.

The main take-away points of this paper are summarised in Table 1. We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using (only) quantitative designs. Seeing qualitative and quantitative methods as equal will help us become more aware and critical of the "fit" between the research problem and our chosen methods: I can conduct an RCT to determine the reasons for transportation delays of acute stroke patients – but should I? It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.

Take-away points

Acknowledgements

Authors’ contributions

LB drafted the manuscript; WW and CG revised the manuscript; all authors approved the final versions.

Funding

No external funding.

Availability of data and materials

Competing interests

The authors declare no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Project analysis: a guide for project managers and teams.

March 13, 2024

Projects are inherently risky and come with their fair share of surprises. 

Projects that aren’t managed proactively can lead to missed deadlines, budget overruns, and poor project outcomes. 

This is where project analysis comes in handy. It allows you to pause, spot potential hiccups, and plan how to tackle them. As a result, you can reduce uncertainties, keep your projects on track, and achieve project goals. 🎯

Here, we’ll discuss project analysis and how it fits into the project management lifecycle. We’ll also share practical methods, best practices, and must-have tools to help you nail project analysis every time.

Ready to get a grip on your projects and consistently guide them to the finish line? Let’s go! 🏁


What Is Project Analysis?

Project analysis is the process of evaluating a project to determine its feasibility, constraints, and potential risks. 

This analysis helps you optimize resource allocation, devise contingency plans, improve project execution, and ultimately achieve your project goals promptly and efficiently.

A lot of the work in project analysis happens before the project kicks off. Usually, you’ll go through five main steps.

  • Define objectives: What’s the overarching goal of the project? Starting with a clear picture of the end goal is key to effective project analysis
  • Gather project details: Pull together all the relevant project details, including timelines, milestones, stakeholders, resources, and total cost estimates 💰
  • Analyze data: With all the project details in hand, it’s time to assess the project’s feasibility and potential risks. There are various methods (we’ll get to some of those in a bit) you can use to identify what these are and figure out how to tackle them
  • Make decisions: Armed with your analysis insights, you can make smarter choices like adjusting certain processes, reallocating resources, or canceling the project altogether
  • Document your findings: Record everything you’ve learned and decided on. This will be invaluable for guiding your current and future projects 💼

Understanding Project Analysis in Project Management

There are five stages in the project management lifecycle: initiation, planning, execution, monitoring and controlling, and closing.

While project analysis helps at each of these stages, it makes the biggest difference at the beginning, where it can minimize the frequency and impact of issues during project execution.

Let’s take a closer look at the application of project analysis throughout the project management lifecycle.

In the beginning: Initiation and planning

During this phase, project analysis helps you clarify the project’s goals, scope, and requirements. 📝

It also helps you decide whether the project is worth pursuing, identify high-priority tasks that must be monitored closely, and plan to counter potential risks.

Keeping it on track: Execution, monitoring, and controlling

When your project is underway, project analysis allows you to check on its progress against your initial plans.

Are you sticking to the initial timeline and budget? Have any of the risks you identified become problems? If so, how are you dealing with them? Have any other unforeseen issues popped up? What can you do about them?

These regular check-ins help you spot deviations early to adjust resource allocation and project execution to get back on track. ✔️

Wrapping it up: Closure

When a project ends, project analysis helps you reflect on the journey. A project retrospective lets you identify what worked, what didn’t work, and what improvements can be made for future projects.

Conducting Project Analysis: Key Methods and Approaches

Conducting effective project analysis means knowing the best tricks of the trade. Here are 10 project analysis techniques for checking your project’s health at every stage of the project management lifecycle. 

1. Strategic planning

Strategic planning involves aligning project objectives with your organization’s vision and long-term goals. This helps you and other stakeholders decide whether to proceed with a project and how to prioritize it among other projects.

For example, let’s say a retail company aims to be a leader in the eco-friendly space. As such, a long-term goal might be to cut carbon emissions by 20% in five years. 

With this in mind, they could consider different types of projects like using renewable energy to power business operations, implementing sustainable packaging solutions, or launching community programs like tree planting and eco-friendly workshops. 🌱

2. Benchmarking analysis

While strategic planning ensures your projects align with business objectives, benchmarking analysis ensures they align with industry standards.

You can implement this in two different ways: 

  • Performance assessment: With competitive analysis , you can compare your business’s current performance against competitors and industry leaders. This helps you spot weaknesses and plan for improvements
  • Project process improvement: Once a project is chosen, use benchmarking to analyze how others with industry-standard results executed similar projects. This helps you set realistic goals, estimate budget and required resources, and adopt best practices to increase your chances of success 

Let’s return to our example of the retail company aiming to cut carbon emissions. They would look at how top companies achieve this (e.g., solar energy or green supply chains) and their execution process to create their own roadmap to follow. 🗺️

3. Cost-benefit analysis (CBA)

Cost-benefit analysis is a tool for determining if a project is worth pursuing by evaluating its benefits against its costs.

Imagine our fictional retail store has to decide whether to invest in implementing solar panels, energy-saving lights, or sustainable packaging. 

Using CBA, they would calculate each project’s costs (money, time, and resources), quantify all associated benefits (like bill savings, a better brand image, and lower pollution taxes), and compute a benefit-cost ratio. 💸

The project with the highest benefit-cost ratio will give the retail shop the most bang for its buck.
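The arithmetic behind CBA is straightforward once benefits are quantified. A minimal sketch with invented cost and benefit figures for the three candidate projects:

```python
# Invented figures: costs and (already quantified, summed) benefits
# for each candidate project, in the same currency.
projects = {
    "solar panels":          {"cost": 120_000, "benefit": 180_000},
    "energy-saving lights":  {"cost": 20_000,  "benefit": 35_000},
    "sustainable packaging": {"cost": 50_000,  "benefit": 60_000},
}

# Benefit-cost ratio: benefit delivered per unit of cost.
ratios = {name: p["benefit"] / p["cost"] for name, p in projects.items()}
best = max(ratios, key=ratios.get)
```

With these numbers the energy-saving lights win (ratio 1.75) despite the smallest absolute benefit, which is exactly the kind of insight a raw cost comparison would miss.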

4. SWOT analysis

SWOT analysis helps you identify internal factors (strengths and weaknesses) and external factors (opportunities and threats) that could affect a project’s success.

Using the retail store example, a SWOT analysis may look like this:

  • Strengths: Big budget and partnerships with eco-friendly suppliers
  • Weaknesses: Limited knowledge of renewable energy and resistance to change
  • Opportunities: Growing demand for sustainable products and tax incentives
  • Threats: Inflation and competition from more sustainable retailers

The retail store can use these insights to take strategic action that leverages its strengths and opportunities and addresses its weaknesses and threats.

5. Stakeholder analysis

Every project involves a group of stakeholders (e.g., investors, customers, suppliers) who significantly influence project outcomes.

Figure out who these key players are, group them by influence and interest, and understand their expectations and communication preferences. This helps you build a communication plan that keeps everyone in the loop. 🎡

Using a stakeholder mapping template is a great place to start. You can also schedule recurring meetings with a tool like ClickUp’s calendar and receive reminders when the time rolls around. 
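Grouping by influence and interest is commonly done with a power/interest grid. A sketch, assuming each stakeholder is scored 0–10 on both axes (the names, scores, and threshold are invented for illustration):

```python
def grid_quadrant(influence, interest, threshold=5):
    """Map influence/interest scores to a standard power/interest quadrant."""
    if influence >= threshold and interest >= threshold:
        return "manage closely"
    if influence >= threshold:
        return "keep satisfied"
    if interest >= threshold:
        return "keep informed"
    return "monitor"

stakeholders = {              # invented (influence, interest) scores
    "investor":       (9, 8),
    "supplier":       (7, 3),
    "customer panel": (3, 9),
    "local press":    (2, 2),
}
plan = {name: grid_quadrant(i, j) for name, (i, j) in stakeholders.items()}
```

The resulting quadrant then drives the communication plan: "manage closely" stakeholders get regular meetings, "keep informed" stakeholders get status updates, and so on.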

6. Critical path method (CPM)

After laying out a project’s timeline, the critical path method lets you identify the sequence of dependent tasks that must be completed on time to keep the project on schedule. 🗓️

ClickUp's Gantt Chart view

A project management tool like ClickUp helps you identify your project’s critical path with the click of a button—all you need to do is enter ClickUp’s Gantt view, add all the required tasks for your project, and specify the duration for each task.
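Under the hood, the critical path is the longest chain of dependent tasks. A minimal sketch over an invented task graph (durations in days, with each task listing its prerequisites):

```python
# Invented task graph: duration in days plus prerequisite tasks.
tasks = {
    "design":  {"days": 3, "deps": []},
    "build":   {"days": 5, "deps": ["design"]},
    "test":    {"days": 2, "deps": ["build"]},
    "docs":    {"days": 1, "deps": ["design"]},
    "release": {"days": 1, "deps": ["test", "docs"]},
}

def critical_path(tasks):
    """Return the critical path and the minimum project duration."""
    finish = {}
    def earliest_finish(name):
        if name not in finish:
            start = max((earliest_finish(d) for d in tasks[name]["deps"]), default=0)
            finish[name] = start + tasks[name]["days"]
        return finish[name]
    end = max(tasks, key=earliest_finish)      # last task to finish
    path = [end]                               # walk back along the
    while tasks[path[-1]]["deps"]:             # binding predecessors
        path.append(max(tasks[path[-1]]["deps"], key=lambda d: finish[d]))
    return list(reversed(path)), finish[end]

path, duration = critical_path(tasks)
```

Here a one-day slip in "docs" would not move the release date, but any slip on design, build, or test would delay the whole project.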

7. Workload analysis

Workload analysis helps you see if your team can handle the demands of a project or if adjustments are necessary, like adding more resources or cutting back to avoid waste.

In practice, workload estimates aren’t always spot on. This is where ClickUp’s resource management features can be a lifesaver. 🦸‍♂️

ClickUp’s Workload view

You can use it during the project execution phase to get a real-time view of your team’s workload—see who’s getting swamped with work and who has time to spare. This allows you to improve team performance and keep projects on track.
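The core calculation behind any workload view is utilisation: hours assigned versus hours available. A minimal sketch with invented names and numbers:

```python
# Invented example: hours assigned to each person in a 40-hour week.
CAPACITY_HOURS = 40
assigned = {"Aisha": 46, "Ben": 30, "Chloe": 40}

# Utilisation > 1.0 means overloaded; well under 1.0 means spare capacity.
utilisation = {name: hours / CAPACITY_HOURS for name, hours in assigned.items()}
overloaded = [name for name, hours in assigned.items() if hours > CAPACITY_HOURS]
underused = [name for name, hours in assigned.items() if hours < 0.8 * CAPACITY_HOURS]
```

Rebalancing would mean moving some of Aisha’s 6 excess hours to Ben, who has room to spare; the 0.8 threshold for "underused" is an arbitrary choice for this sketch.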

8. Earned value analysis (EVA)

Earned value analysis is a quantitative measure of whether a project progresses as planned. Use it at any point during the project execution phase to determine if a project is on schedule and running within budget.

If the project is over budget or behind schedule, this is a sign that issues need attention. Once you’ve identified setbacks, you can reallocate resources, secure more funds, and change task priorities to get the project back under control.
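EVA rests on three numbers — planned value (PV), earned value (EV), and actual cost (AC) — and the standard indices derived from them. The figures below are invented for illustration:

```python
# Invented status figures for a project partway through execution.
PV = 50_000   # planned value: budgeted cost of work scheduled to date
EV = 40_000   # earned value: budgeted cost of work actually completed
AC = 48_000   # actual cost of the work completed

cpi = EV / AC   # cost performance index: < 1 means over budget
spi = EV / PV   # schedule performance index: < 1 means behind schedule
cv = EV - AC    # cost variance (negative: overspent)
sv = EV - PV    # schedule variance (negative: behind plan)
```

With these numbers the project is both over budget (CPI ≈ 0.83) and behind schedule (SPI = 0.8), signalling exactly the kind of issues that need attention.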

9. Root cause analysis (RCA)

Regardless of how well you plan a project, unexpected issues may still pop up during project execution. Instead of just slapping a quick fix on these problems, root cause analysis helps you dig deep to uncover their real causes.

Let’s say your project is behind schedule, and the symptom is missed deadlines. However, through RCA, you might discover that the root cause is unclear task assignments or a bottleneck in approvals.

Understanding the root causes allows you to address communication or workflow issues so your project can get back on track without the same old headaches. 💆

10. Project evaluation

Project evaluation closes the project’s lifecycle and lets you assess its success. You can conduct a project evaluation by gathering relevant data (like surveys, interviews, and business reports) and analyzing it to see if the project hit its goals.

Remember the retail shop example we used earlier? Now is the time to check if the chosen “green project” met expectations. By how much did it reduce costs or emissions? What’s the feedback from staff and customers? 

Comparing how your results stack up against your goal lets you see what went right and where things could have been better. Documenting these insights lets you spot the wins and determine where to step up your game in future projects. 💪

Common Project Analysis Challenges

Conducting proper project analysis isn’t always smooth sailing. You’ll likely hit a few bumps along the way. 🚢

However, you can steer the ship in the right direction. Here are three common project management challenges you might encounter and practical tips on addressing them.

1. Scope creep

Scope creep is when your project grows beyond its original plans over time. This can cause you to miss deadlines, go over budget, and even lead to project failure. Setting clear, agreed-upon goals and requirements is critical to preventing this.

Implementing a change management process will help you evaluate any proposed project changes. This way, you can decide whether to approve or discard them based on their impact on the project’s timeline, budget, and resources.

2. Data overload

Dealing with heaps of project data, like endless lists of tasks and piles of feedback, can make your head spin.

It’s tough to figure out what you need to focus on, leading to slow decision-making, overlooked insights, and wasted time on things that don’t matter. This chaos can significantly slow down your project’s progress and performance. 📉

ClickUp's Performance Dashboard

Here’s where ClickUp’s Dashboards help you cut through the clutter and focus on what matters. Use it to get a visual overview of key project metrics without getting lost in the data jungle.

3. Inaccurate risk assessment

Most project analysis is about predicting risks and planning for them before a project is implemented. A thorough risk assessment is essential to prepare for and reduce threats to your project’s success. 🚨

One of the best ways to counter this is to encourage input from all stakeholders and team members during risk identification and planning. You can also hire external consultants to weigh in on this process. This increases the accuracy of your risk assessment and enhances the efficiency of your response plans. 

Project Analysis: Best Practices

Now that we have those tricky project analysis challenges out of the way, let’s dive into best practices to make the process simpler and smoother. 🧘

1. Use a project management tool

The right project management tool will help you track what analysis tasks need to be done, who’s responsible for them, and when they need to be completed.

ClickUp's List view

ClickUp’s project management features make organizing a breeze. Plus, with over 1,000 project management templates , ClickUp gives you a head start in setting things up. 

For example, use the Analysis Framework Template to dive right into the analysis process without wasting time. ⏰

2. Communicate regularly

Regular and open communication is the lifeblood of successful project analysis. It helps the team stay updated on project status, spot problems early, and make smart decisions.

ClickUp makes this easy with features like task comments and native chat, allowing quick feedback and discussions right where the work happens. This means no more digging through emails to find information or updates. 👀 

Project analysis: ClickUp’s Whiteboards

Another must-have feature is ClickUp’s Whiteboards . This is ideal for real-time visual brainstorming sessions with the project team and stakeholders. You can gather everyone’s input when setting out project plans, identifying potential problems, and solving issues.

3. Document everything

Documentation is critical to project analysis. Document all your research, findings, decisions, and recommendations in a clear and organized project report. It promotes transparency within your team and is a valuable resource for future reference.

While this sounds like a lot, it doesn’t have to be. ClickUp AI helps you generate research-backed content in seconds, summarize research notes, and polish your writing in a snap. ✍️

You can round up all your documents in ClickUp Docs , share them with your team, and collaborate on updates when needed. 

Common FAQs

Here are some answers to common questions about project analysis.

1. What is included in project analysis?

Project analysis assesses a project’s feasibility, identifies potential risks, and outlines a plan to counter them. Much of this happens during the planning phase to minimize surprises during project execution. Key tools for project analysis include strategic planning, benchmarking analysis, and critical path method.

2. What are the five steps of project analysis?

The five steps of project analysis in the planning phase are:

  • Define objectives: Be clear on what you want to achieve with the project and project delivery requirements
  • Gather project details: This includes project schedules, project completion date, resources, and stakeholders
  • Analyze data: Assess feasibility and potential risks with project analysis tools and methods 
  • Make decisions: Decide whether to cancel the project or make changes and proceed
  • Document findings: Keep records of your analysis and decisions to guide project execution

3. What is process analysis?

Process analysis is an analysis methodology that evaluates project processes to identify inefficiencies and bottlenecks. This way, project managers can make corrections to optimize workflows and ensure the project stays on track. Examples of process analysis methods include the critical path method and root cause analysis.
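The critical path method mentioned above can be sketched in a few lines: compute each task’s earliest finish time in dependency order, then walk back along the longest chain. The task names, durations, and dependencies below are hypothetical, purely for illustration:

```python
# Minimal critical path sketch. Tasks and durations are invented
# for the example, not taken from any real project plan.
from graphlib import TopologicalSorter

def critical_path(durations, deps):
    """durations: task -> days; deps: task -> set of prerequisite tasks."""
    order = list(TopologicalSorter(deps).static_order())
    earliest = {}   # task -> earliest finish time
    parent = {}     # task -> predecessor on the longest path
    for task in order:
        start = max((earliest[d] for d in deps.get(task, ())), default=0)
        earliest[task] = start + durations[task]
        parent[task] = max(deps.get(task, ()),
                           key=lambda d: earliest[d], default=None)
    # Walk back from the task that finishes last.
    end = max(earliest, key=earliest.get)
    path = []
    while end is not None:
        path.append(end)
        end = parent[end]
    return list(reversed(path)), max(earliest.values())

tasks = {"design": 3, "build": 5, "test": 2, "docs": 1}
deps = {"build": {"design"}, "test": {"build"}, "docs": {"design"}}
path, total = critical_path(tasks, deps)
print(path, total)  # ['design', 'build', 'test'] 10
```

Tasks on the critical path have zero slack: any delay to them delays the whole project, which is why the method is useful for spotting bottlenecks.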

Handle Project Analysis Like a Pro With ClickUp

Drafting a new project plan and diving headfirst into execution is a recipe for disaster—it’s like diving into a lake without checking for rocks and alligators! 🐊

To ensure projects run smoothly and meet project goals, you must conduct comprehensive data analysis, project risk analysis, and cost analysis using the right tools and best practices. 

And let’s face it, juggling all these tools can get overwhelming without a bit of help. Let ClickUp handle the heavy lifting so you can keep a pulse on project performance and focus on work that moves the needle. 

Sign up for a free ClickUp plan to make your next project a winner. 🏆 


Intelligence Value

REASON aims to develop novel technologies that will enable intelligence analysts to substantially improve the evidence and reasoning in draft analytic reports. Intelligence analysts sort through huge amounts of often uncertain and conflicting information as they strive to answer intelligence questions. REASON will assist and enhance analysts’ work by pointing them to key pieces of evidence beyond what they have already considered and by helping them determine which alternative explanations have the strongest support. It will do this automatically and on demand by providing evidence and reasoning suggestions as the analyst works on a report. The program will exploit recent advances in artificial intelligence, not to perform the analysis or write the report, but to help analysts do it even better. As a result, decision-makers will receive analytic reports with the highest accuracy, clarity and timeliness.

Decision makers rely on the Intelligence Community to help them understand a wide variety of complex issues. Intelligence analysts face numerous challenges in their efforts to produce high-quality analytic reports. One major challenge is finding all relevant evidence from an ever-growing collection of often uncertain and conflicting information drawn from classified and unclassified sources. A second challenge is making well-reasoned judgments in the face of uncertainty. REASON will develop technology that analysts can use to discover additional relevant evidence (including contrary evidence) and to identify strengths and weaknesses in reasoning.  

REASON is not designed to replace analysts, write complete reports, or increase analysts’ workload. The technology will work within the analyst’s current workflow, functioning much like an automated grammar checker but with a focus on evidence and reasoning.

Performer teams will conduct research and development to build systems that will be evaluated by an independent testing and evaluation (T&E) team. Independent T&E will ensure that REASON is effective in helping analysts discover valuable evidence, identify strengths and weaknesses in reasoning, and produce higher quality reports. 

HYBRID PROPOSERS' DAY INFORMATION:

Sam.Gov Reference

REASON Teaming Form

REASON Technical Description 

TEAMING INFORMATION

REASON Teaming Summary

Proposers' Day Briefings

Accenture Lightning Talk

Avramov and Adair Lightning Talk

ARA Lightning Talk

ARG-tech Capabilities Statement

ARG-tech Lightning Talk

Blue Ridge Dynamics, Inc. Capabilities Statement

Blue Ridge Dynamics, Inc. Lightning Talk

City University of New York Lightning Talk

C3AI Capabilities Statement

C3 AI Lightning Talk

Datyra Capabilities Statement

Descartes Labs Capabilities Statement

DiBella Lightning Talk

Duke University Capabilities Statement

Gruetzemacher Capabilities Statement

Guidehouse Inc. Lightning Talk

Jacobs Lightning Talk

KRC Research Capabilities Statement

Language Computer Lightning Talk

Logic Intelligence Capabilities Statement

Logic Intelligence Lightning Talk

Mali Lightning Talk

Matellio Capabilities Statement

MORSE Corp Lightning Talk 

New Jersey Institute of Technology Lightning Talk

Old Dominion University Lightning Talk

Ox Capabilities Statement

Ox Lightning Talk

Parallax Advanced Research Lightning Talk

Penn State University Lightning Talk

Peraton Labs Lightning Talk

Protagonist Technologies, LLC Capabilities Statement

Pyrik Capabilities Statement

Pytho Lightning Talk

Safe-esteem Capabilities Statement

Safe-esteem Lightning Talk

Scale, Inc. Lightning Talk

Sentimetrix, Inc. Capabilities Statement

SRI Lightning Talk

Stevens Institute of Technology Lightning Talk

Storage Strategies, Inc. Capabilities Statement

Synthetic Decision Group Capabilities Statement

Tecuci Lightning Talk

Topologe, LLC Lightning Talk

University of Colorado Boulder Lightning Talk

University of Notre Dame Lightning Talk

Whysaurus Lightning Talk


Contact Information

Program Manager

Dr. Steven Rieber

[email protected]

301-243-2087

Broad Agency Announcement (BAA)

Link(s) to BAA

W911NF-23-S-0007

Solicitation Status

Proposers' Day Date

January 11, 2023

BAA Release Date

March 20, 2023

Proposal Due Date

May 8, 2023

Program Summary

Additional Information

REASON BAA Amendment 2 - Q&A

Protecting people from a changing climate: The case for resilience

About the authors.

This article is a collaborative effort by Harry Bowcott, Lori Fomenko, Alastair Hamilton, Mekala Krishnan, Mihir Mysore, Alexis Trittipo, and Oliver Walker.

The United Nations’ 2021 Intergovernmental Panel on Climate Change (IPCC) report stated—with higher confidence than ever before—that, without meaningful decarbonization, global temperatures will rise to at least 1.5°C above preindustrial levels within the next two decades. 1 Climate change 2021: The physical science basis, Intergovernmental Panel on Climate Change (IPCC), August 2021, ipcc.ch. This could have potentially dangerous and irreversible effects. A better understanding of how a changing climate could affect people around the world is a necessary first step toward defining solutions for protecting communities and building resilience. 2 For further details on how a changing climate will impact a range of socioeconomic systems, see “Climate risk and response: Physical hazards and socioeconomic impacts,” McKinsey Global Institute, January 16, 2020.

As part of our knowledge partnership with Race to Resilience at the UN Climate Change Conference of the Parties (COP26) in Glasgow, we have built a detailed, global assessment of the number of people exposed to four key physical climate hazards, primarily under two different warming scenarios. This paper lays out our methodology and our conclusions from this independent assessment.

A climate risk analysis focused on people: Our methodology in brief

Our research consists of a global analysis of the exposure of people’s lives and livelihoods to multiple hazards related to a changing climate. This analysis identifies people who are potentially vulnerable to four core climate hazards—heat stress, urban water stress, agricultural drought, and riverine and coastal flooding—even if warming is kept within 2.0°C above preindustrial levels.

Our methodology

The study integrates climate and socioeconomic data sources at a granular level to evaluate exposure to climate hazards. We used an ensemble mean of a selection of Coupled Model Intercomparison Project Phase 5 (CMIP5) global climate models under Representative Concentration Pathway (RCP) 8.5—using a Shared Socioeconomic Pathway (SSP2) for urban water stress—with analysis conducted under two potential warming scenarios: global mean temperature increases above preindustrial levels of 1.5°C and 2.0°C. We sometimes use the shorthand of “1.5°C warming scenario” and “2.0°C warming scenario” to describe these scenarios. Our modeling of temperatures in 2030 refers to a multidecadal average between 2021 and 2040; when we say 2050, we refer to a multidecadal average between 2041 and 2060. These projections are considered relative to a reference period (which we sometimes refer to as “today”), which depends on the availability of baseline data for each hazard.

We built our analysis by applying 2030 and 2050 population-growth projections to our 1.5°C and 2.0°C warming scenarios, respectively. This amount of warming by those time periods is consistent with an RCP 8.5 scenario, relative to the preindustrial average. Climate science makes extensive use of scenarios; we chose the higher-emissions RCP 8.5 scenario to measure the full inherent risk from a changing climate. Research also suggests that cumulative historical emissions, which indicate the actual degree of warming, have been in line with RCP 8.5. 1 For further details, see “Climate risk and response,” January 16, 2020, appendix; see also Philip B. Duffy, Spencer Glendon, and Christopher R. Schwalm, “RCP8.5 tracks cumulative CO2 emissions,” Proceedings of the National Academy of Sciences of the United States of America (PNAS), August 2020, Volume 117, Number 33, pp. 19656–7, pnas.org. In some instances, we have also considered a scenario in which decarbonization actions limit warming, so that 1.5°C of warming relative to preindustrial levels is reached only in 2050 rather than in 2030. The models we used differ to some extent in their exact amount and timing of warming, even under the same emissions scenario (RCP 8.5). All forward-looking climate models are subject to uncertainty, and taking an ensemble approach allows us to account for some of that model uncertainty and error. 2 For a more detailed discussion of these uncertainties, see chapter 1 of “Climate risk and response: Physical hazards and socioeconomic impacts,” McKinsey Global Institute, January 16, 2020. However, the mean amount of warming seen across our ensemble of models is approximately 1.5°C by 2030 and 2.0°C by 2050.

Our analysis consisted of three major steps (see technical appendix for details on our methodology):

First, we divided the surface of the planet into a grid composed of five-kilometer cells, with climate hazards and socioeconomic data mapped for each cell.

Second, in each of those cells, we combined climate and socioeconomic data to estimate the number and vulnerability of people likely to be exposed to climate hazards. These data were categorized on the basis of severity and classified on the basis of exposure to one or more hazards at the grid-cell level.

Third, taking into account people’s vulnerability, we examined the potential impact of our four core hazards on the current and future global population. To do this, we assessed, globally, the number and vulnerability of people affected by different types and severities of hazards. We then aggregated the data from each cell up to the subnational, national, subcontinental, continental, and global levels to allow for comparison across countries.
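Computationally, the three steps above amount to a grouped sum: for each cell, count its population as exposed if any hazard in that cell reaches the severity of interest, then aggregate by region. A minimal sketch with invented cell data and region labels (the real analysis uses five-kilometer cells and far richer socioeconomic attributes):

```python
# Illustrative grid-cell aggregation. Cell populations, hazards, and
# region labels are invented for the example; the study maps real
# climate and socioeconomic data onto 5 km cells.
from collections import defaultdict

# Each cell: (region, population, {hazard: severity})
cells = [
    ("A", 120_000, {"heat": "severe", "flood": "mild"}),
    ("A",  80_000, {"heat": "moderate"}),
    ("B", 200_000, {"drought": "severe", "water": "severe"}),
    ("B",  50_000, {}),
]

def aggregate_exposure(cells, severity="severe"):
    """Per region: (people in a cell with >=1 hazard at `severity`, total people)."""
    exposed = defaultdict(int)
    total = defaultdict(int)
    for region, pop, hazards in cells:
        total[region] += pop
        if severity in hazards.values():
            exposed[region] += pop
    return {r: (exposed[r], total[r]) for r in total}

print(aggregate_exposure(cells))
# {'A': (120000, 200000), 'B': (200000, 250000)}
```

The same per-cell results can then be rolled up to subnational, national, and global totals by changing the grouping key.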

It’s important to note that we carefully selected these four hazards because they capture the bulk of hazards likely to affect populations on a global scale. We did not account for a range of other hazards such as wildfires, extreme cold, and snow events. Further, our analysis accounts only for first-order effects of climate hazards and does not take into account secondary or indirect effects, which can have meaningful impact. Drought, for example, can lead to higher food prices and even migration—none of which are included in our analysis. Thus, the number of people affected by climate hazards is potentially underestimated in this work.

A focus on four main climate hazards

For our study, we used global data sets covering four key hazards: heat stress, urban water stress, agricultural drought, and riverine and coastal flooding. We relied on data from a selection of CMIP5 climate models, unless otherwise specified. For further details, see the technical appendix.

Heat stress

Heat stress can have meaningful impacts on lives and livelihoods as the climate changes. Heat stress is measured using wet-bulb temperature, which combines heat and humidity. We assess heat stress in the form of acute exposure to humid heat-wave occurrence as well as potential chronic loss in effective working hours, both of which depend on daily wet-bulb temperatures. Above a wet-bulb temperature of 35°C, heat stress can be fatal.

Acute humid heat waves are defined by the average wet-bulb temperature of the hottest six-hour period during a rolling three-day period in which the daily maximum wet-bulb temperature exceeds 34°C for three consecutive days. 3 Analysis of lethal heat waves in our previous McKinsey Global Institute report (see “Climate risk and response,” January 16, 2020) was limited to urban populations, and the temperature threshold was set to 34°C wet-bulb temperature under the assumption that the true wet-bulb temperature would actually be 35°C due to an additional 1°C from the urban heat-island effect. Heat-wave occurrence was calculated for each year for both a reference time period and our two future time periods and translated into annual probabilities. 4 The reference period for heat stress refers to the average between 1998 and 2017. Exposure was defined as anyone living in either an urban or rural location with at least a 2 percent annual probability of experiencing such a humid heat wave in any given year. Acute humid heat waves of 34°C or higher can be detrimental to health, even for a healthy and well-hydrated human resting in the shade, because the body begins to struggle to regulate its core temperature and the likelihood of heat stroke increases.
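The heat-wave criterion above reduces to a run-length check on a daily wet-bulb series. A minimal sketch; the 34°C threshold and three-day run come from the definition above, while the temperature series and probability figures are invented:

```python
# Sketch of the acute humid heat-wave test described above: an event
# occurs when the daily maximum wet-bulb temperature exceeds 34°C for
# three consecutive days. The temperature series is hypothetical.
def has_humid_heat_wave(daily_max_wet_bulb, threshold=34.0, run_length=3):
    """True if any `run_length` consecutive days all exceed `threshold`."""
    run = 0
    for t in daily_max_wet_bulb:
        run = run + 1 if t > threshold else 0
        if run >= run_length:
            return True
    return False

year = [31.0, 33.5, 34.2, 34.6, 34.1, 32.0]  # °C, invented values
print(has_humid_heat_wave(year))  # True: days 3-5 all exceed 34°C

# Exposure in the study requires at least a 2 percent annual probability
# of such an event; e.g., 5 event-years out of 100 simulated years:
exposed = 5 / 100 >= 0.02
```

In the study this test runs per grid cell per year, and the fraction of event-years becomes the annual probability compared against the 2 percent cutoff.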

Chronic heat stress was assessed for select livelihoods and defined by processing daily mean air temperature and relative humidity data into a heat index and translating that into the fraction of average annual effective working hours lost due to heat exposure. This calculation was conducted following the methods of John P. Dunne et al., 5 John P. Dunne, Ronald J. Stouffer, and Jasmin G. John, “Reductions in labour capacity from heat stress under climate warming,” Nature Climate Change , 2013, Volume 3, Number 6, pp. 563–6, nature.com. using empirically corrected International Organization for Standardization (ISO) heat-exposure standards from Josh Foster et al. 6 Josh Foster et al., “A new paradigm to quantify the reduction of physical work capacity in the heat,” Medicine and Science in Sports and Exercise , 2019, Volume 51, Number 6S, p. 15, journals.lww.com.

We combined groups of people who were exposed to both chronic and acute heat stress to assess the aggregate number of people exposed. Heat stress can affect livelihoods, particularly for those employed in outdoor occupations, most prominently because an increased need for rest and a reduction in the body’s efficiency reduce effective working hours. Therefore, our analysis of potential exposure to chronic heat stress was limited to people estimated to be working in agriculture, crafts and trades, elementary, factory-based, and manufacturing occupations likely to experience at least a 5 percent loss of effective working hours on average annually. We excluded managers, professional staff, and others who are more likely to work indoors, in offices, or in other cooled environments from this analysis.
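The chronic-exposure screen in this paragraph can be sketched as a simple filter: keep only workers in the listed outdoor-heavy occupation groups whose average annual loss of effective working hours is at least 5 percent. The occupation labels and loss fractions below are invented for illustration; the study derives the losses from daily heat-index data following Dunne et al.:

```python
# Illustrative chronic heat-stress screen. Occupation groups follow the
# text above; the individual records and loss fractions are invented.
OUTDOOR_OCCUPATIONS = {"agriculture", "crafts and trades", "elementary",
                       "factory-based", "manufacturing"}

def chronically_exposed(workers, loss_threshold=0.05):
    """workers: list of (occupation, fraction of annual working hours lost)."""
    return [
        (occ, loss) for occ, loss in workers
        if occ in OUTDOOR_OCCUPATIONS and loss >= loss_threshold
    ]

workers = [
    ("agriculture", 0.08),   # counted: outdoor work, above threshold
    ("manager", 0.08),       # excluded: likely indoor occupation
    ("manufacturing", 0.03), # excluded: below the 5% loss threshold
]
print(chronically_exposed(workers))  # [('agriculture', 0.08)]
```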

Urban water stress

Urban water stress often occurs in areas in which demand for water from residents, local industries, municipalities, and others exceeds the available supply. 7 The reference period for water stress refers to the average between 1950 and 2010. This issue can become progressively worse over time as demand for water continues to increase and supply either remains constant, decreases due to a changing climate, or increases but not quickly enough to match demand. This can reduce urban residents’ access to drinking water or slow production in urban industry and agriculture.

Our analysis of water stress is limited to urban areas partially because water stress is primarily a demand-driven issue that is more influenced by socioeconomic factors than by changes in climate. We also wanted to avoid methodological overlap with our agricultural drought analysis, which mostly focused on rural areas.

We define urban water stress as the ratio of water demand to supply for urban areas globally. We used World Resources Institute (WRI) data for baseline water stress today and the SSP2 scenario for future water stress outlooks, where 2030 represents the 1.5°C warming scenario and 2040 represents the 2.0°C warming scenario. We only considered severe water stress, defined as withdrawals of 80 percent or more of the total supply, which WRI classifies as “extremely high” water stress.

We make a distinction for “most severe” urban water stress, defined as withdrawals of more than 100 percent of the total supply, to show how many people could be affected by water running out—a situation that will require meaningful interventions to avoid. However, for the sake of the overall exposure analysis, people exposed to the most severe category are considered to be exposed to “severe” water stress unless otherwise noted (exhibit).
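The thresholds in the two paragraphs above translate directly into a classifier on the demand-to-supply ratio: withdrawals of at least 80 percent of supply are “severe” (WRI’s “extremely high”), and more than 100 percent is “most severe.” The demand and supply figures in the usage lines are illustrative:

```python
# Sketch of the urban water-stress categories described above.
# Thresholds follow the text; the example figures are invented.
def water_stress_category(demand, supply):
    """Classify urban water stress by the ratio of withdrawals to supply."""
    ratio = demand / supply
    if ratio > 1.0:
        return "most severe"   # demand exceeds total available supply
    if ratio >= 0.8:
        return "severe"        # WRI "extremely high" water stress
    return "below severe"

print(water_stress_category(85, 100))   # severe
print(water_stress_category(120, 100))  # most severe
```

As the text notes, people in the “most severe” bucket are folded into “severe” for the overall exposure tallies unless stated otherwise.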

Agricultural drought

Agricultural drought is a slow-onset hazard defined by a period of months or years that is dry relative to a region’s normal precipitation and soil-moisture conditions, specifically, anomalously dry soils in areas where crops are grown. 8 The reference period for agricultural drought refers to the average between 1986 and 2005. Drought can inhibit plant growth and reduce plant production, potentially leading to poor yields and crop failures. For more details, see the technical appendix.

Riverine and coastal flooding

We define flooding as the presence of water at least one centimeter deep on normally dry land. We analyze two types of flooding here: riverine flooding from rivers bursting their banks and coastal flooding from storm surges and rising sea levels pushing water onto coastal land. Both coastal and riverine flooding can damage property and infrastructure. In severe cases, they could lead to loss of life. 9 The reference period for riverine flooding refers to the average between 1960 and 1999; the reference period for coastal flooding refers to the average between 1979 and 2014. For more details, see the technical appendix.

Based on a combination of frequency and intensity metrics, we estimated three severity levels of each climate hazard: mild, moderate, and severe (exhibit).

Even when we only look at first-order effects, it is clear that building resilience and protecting people from climate hazards are critical. Our analysis provides data that may be used to identify the areas of highest potential exposure and vulnerability and to help build a case for investing in climate resilience on a global scale.

Our findings suggest the following conclusions:

  • Under a scenario with 1.5°C of warming above preindustrial levels by 2030, almost half of the world’s population could be exposed to a climate hazard related to heat stress, drought, flood, or water stress in the next decade, up from 43 percent today, and almost a quarter of the world’s population would be exposed to severe hazards. 3 Climate science makes extensive use of scenarios; we have chosen Representative Concentration Pathway (RCP) 8.5 and a multimodel ensemble to best model the full inherent risk absent mitigation and adaptation. Scenario 1 consists of a mean global temperature rise of 1.5°C above preindustrial levels, which is reached by about 2030 under this RCP; Scenario 2 consists of a mean global temperature rise of 2.0°C above preindustrial levels, reached around 2050 under this RCP. Following standard practice, future estimates for 2030 and 2050 represent average climatic behavior over multidecadal periods: 2030 represents the average of the 2021–2040 period, and 2050 represents the average of the 2041–2060 period. We also compare results with today, also based on multidecadal averages, which differ by hazard. For further details, see the technical appendix. (For detailed explanations of these hazards and how we define “severe,” see sidebar “A climate risk analysis focused on people: Our methodology in brief.”)
  • Indeed, as severe climate events become more common, even in a scenario where the world reaches 1.5°C of warming above preindustrial levels by 2050 rather than 2030, nearly one in four people could be exposed to a severe climate hazard that could affect their lives or livelihoods.
  • Climate hazards are unevenly distributed. On average, lower-income countries are more likely to be exposed to certain climate hazards compared with many upper-income countries, primarily due to their geographical location but also to the nature of their economies. (That said, both warming scenarios outlined here are likely to expose a larger share of people in nearly all nations to one of the four modeled climate hazards compared with today.) Those who fall within the most vulnerable categories are also more likely to be exposed to a physical climate hazard.

These human-centric data can help leaders identify the best areas of focus and the scale of response needed to help people—particularly the most vulnerable—build their climate resilience.

A larger proportion of the global population could be exposed to a severe climate hazard compared with today

Under a scenario with 1.5°C of warming above preindustrial levels by 2030, almost half of the world’s population—approximately 5.0 billion people—could be exposed to a climate hazard related to heat stress, drought, flood, or water stress in the next decade, up from 43 percent (3.3 billion people) today.

In much of the discussion below, we focus on severe climate hazards to highlight the most significant effects from a changing climate. We find that regardless of whether warming is limited to 1.5°C or reaches 2.0°C above preindustrial levels by 2050, severe hazard occurrence is likely to increase, and a much larger proportion of the global population could be exposed compared with today (Exhibit 1).

This proportion could more than double, with approximately one in three people likely to be exposed to a severe hazard under a 2.0°C warming scenario by 2050, compared with an estimated one in six exposed today. This amounts to about 2.0 billion additional people likely to be exposed by 2050. Even in a scenario where aggressive decarbonization results in just 1.5°C of warming above preindustrial levels by 2050, the number of people exposed to severe climate hazards could still increase to nearly one in four of the total projected global population, compared with one in six today.

One-sixth of the total projected global population, or about 1.4 billion people, could be exposed to severe heat stress, either acute (humid heat waves) or chronic (lost effective working hours), under a 2.0°C warming scenario above preindustrial levels by 2050, compared with less than 1 percent, or about 0.1 billion people, likely to be exposed today (Exhibit 2).

Our results suggest that both the severity and the geographic reach of severe heat stress may increase to affect more people globally, despite modeled projections of population growth, population shifts from rural to urban areas, and economic migration. Our analysis does not attempt to account for climate-change-related migration or resilience interventions, which could decrease exposure by either forcing people to move away from hot spots or mitigating impacts from severe heat stress.

For those with livelihoods affected by severe chronic heat stress, it could become too hot to work outside during at least 25 percent of effective working hours in any given year. This would likely affect incomes and might even require certain industries to rethink their operations and the nature of workers’ roles. For outdoor workers, extreme heat exposure could also result in chronic exhaustion and other long-term health issues. Heat stress can cause reductions in worker productivity and hours worked due to physiological limits on the human body, as well as an increased need for rest.

We have already seen some of the impacts of acute heat stress in recent years. In the summer of 2010 in Russia, tens of thousands of people died of respiratory illness or heat stress during a large heat-wave event in which temperatures rose to more than 10°C (50°F) higher than average temperatures for those dates. One academic study claims “an approximate 80 percent probability” that the new record high temperature “would not have occurred without climate warming.” 4 Dim Coumou and Stefan Rahmstorf, “Increase of extreme events in a warming world,” Proceedings of the National Academy of Sciences of the United States of America (PNAS) , November 2011, Volume 108, Number 44, pp. 17905–9, pnas.org. To date these impacts have been isolated events, but the potential impact of heat stress on a much broader scale is possible in a 1.5°C or 2.0°C warming scenario in the coming decades.

While we did not assess second-order impacts, they could also be meaningful. Secondary impacts from heat stress may include loss of power, and therefore air conditioning, due to greater stress on electrical grids during acute heat waves, 5 Sofia Aivalioti, Electricity sector adaptation to heat waves , Sabin Center for Climate Change Law, Columbia University, 2015, academiccommons.columbia.edu. increased stress on hospitals due to increased emergency room visits and admission rates primarily during acute heat-stress events, 6 Climate change and extreme heat events , Centers for Disease Control and Prevention, 2015, cdc.gov. and migration driven primarily by impacts from chronic heat stress. 7 Mariam Traore Chazalnoël, Dina Ionesco, and Eva Mach, Extreme heat and migration , International Organization for Migration, United Nations, 2017, environmentalmigration.iom.int.

The rate of growth in global urban water demand is highly likely to outpace that of urban water supply under future warming and socioeconomic pathway scenarios, compared with the overall historical baseline period (1950–2010). In most geographies, this problem is primarily caused not by climate change but by population growth and a corresponding growth in demand for water. However, in some geographies, urban water stress can be exacerbated by the impact of climate change on water supply. In a 2.0°C warming scenario above preindustrial levels by 2050, about 800 million additional people could be living in urban areas under severe water stress compared with today (Exhibit 3). This could result in lack of access to water supplies for drinking, washing and cleaning, and maintaining industrial operations. In some areas, this could make a case for investment in infrastructure such as pipes and desalination plants to make up for the deficit.

Agricultural drought is most likely to directly affect people employed in the agricultural sector: in conditions of anomalously dry soils, plants do not have an adequate water supply, which inhibits plant growth and reduces production. This in turn could have adverse impacts on agricultural livelihoods.

In a scenario with warming 2.0°C above preindustrial levels by 2050, nearly 100 million people—or approximately one in seven of the total global rural population projected to be employed in the agricultural sector by 2050—could be exposed to a severe level of drought, defined as an average of seven to eight drought years per decade. This could severely diminish people’s ability to maintain a livelihood in rainfed agriculture. Additional irrigation would be required, placing further strain on water demand, and yields could still be reduced if exposed to other heat-related hazards.

While our analysis focused on the first-order effects of agricultural drought, the real-world impact could be much larger. Meaningful second-order effects of agricultural drought include reduced access to drinking water and widespread malnutrition. In addition, drought in regions with insufficient aid can cause infectious disease to spread.

Further, although our analysis did not cover food security, many other studies have posited that if people are unable to appropriately adapt, this level of warming would raise the risk of breadbasket failures and could lead to higher food prices. 8 For more on how a changing climate might affect global breadbaskets, see “ Will the world’s breadbaskets become less reliable? ,” McKinsey Global Institute, May 18, 2020.

Primarily as a result of surging demand exacerbated by climate change, 9 Salvatore Pascale et al., “Increasing risk of another Cape Town ‘Day Zero’ drought in the 21st century,” Proceedings of the National Academy of Sciences of the United States of America (PNAS), November 2020, Volume 117, Number 47, pp. 29495–503, pnas.org. Cape Town, South Africa, a city in a semi-arid region, recently experienced a water shortage. From 2015 to 2018, unusually high temperatures drove higher rates of evaporation with little replenishment from low rainfall, and water reserves fell to emergency levels. 10 “Cape Town’s Water is Running Out,” NASA Earth Observatory, January 14, 2018, earthobservatory.nasa.gov. By January 2018, about 4.3 million residents had endured years of constant restrictions on water use in both urban and agricultural settings. Area farmers recorded losses, and many agricultural workers lost their jobs. In the city, businesses were hit with steep water tariffs, jobs were lost, and residents had to ration water.

Under a scenario with warming 2.0°C above preindustrial levels by 2050, about 400 million people could be exposed to severe riverine or coastal flooding that may breach the defenses in place today. As the planet warms, patterns of flooding are likely to shift: flood depth could decrease in some regions and increase beyond the capacity of existing defenses in others.

Riverine floods can disrupt travel and supply chains, damage homes and infrastructure, and even lead to loss of life in extreme cases. The most vulnerable are likely to be disproportionately affected—fragile homes in informal coastal settlements are highly vulnerable to flood-related damages.

This analysis does not account for the secondary impacts of floods that may affect people. In rural areas, floods could cause the salinity of soil to increase, which in turn could damage agricultural productivity. Flooding could also make rural roads impassable, limiting residents’ ability to evacuate and their access to emergency response. Major floods sometimes lead to widespread impacts caused by population displacement, healthcare disruptions, food supply disruptions, drinking-water contamination, psychological trauma, and the spread of respiratory and insect-borne disease. 11 Christopher Ohl and Sue Tapsell, “Flooding and human health: The dangers posed are not always obvious,” British Medical Journal (BMJ) , 2000, Volume 321, Number 7270, pp. 1167–8, bmj.com; Shuili Du, C.B. Bhattacharya, and Sankar Sen, “Maximizing business returns to corporate social responsibility (CSR): The role of CSR communication,” International Journal of Management Reviews (IJMR) , 2010, Volume 12, Number 1, pp. 8–19, onlinelibrary.wiley.com. The severity of these impacts varies meaningfully across geographic and socioeconomic factors. 12 Roger Few et al., Floods, health and climate change: A strategic review , Tyndall Centre working paper, number 63, November 2004, unisdr.org.

People in lower-income countries tend to have higher levels of exposure to hazards

Our analysis suggests that exposure to climate hazards is unevenly distributed. Overall, a greater proportion of people living in lower-income countries are likely to be exposed to one or more climate hazards (Exhibit 4). Under a scenario with warming 2.0°C above preindustrial levels by 2050, more than half the total projected global population could be affected by a climate hazard. On the other hand, only 10 percent of the total population in high-income countries is likely to be exposed. That said, there could also be meaningful increases in overall exposure in developed nations. For example, based on 2050 population projections, about 160 million people in the United States—almost 40 percent of the US population—could be exposed to at least one of the four climate hazards in a 2.0°C warming scenario by 2050.

In all, our analysis suggests that nearly twice as many highly vulnerable people (those estimated to have lower income and who may also have inadequate shelter, transportation, skills, or funds to protect themselves from climate risks) could be exposed to a climate hazard (Exhibit 5).

One of the implications of these findings is that certain countries are likely to be disproportionately affected. Two-thirds of the people who could be exposed to a climate hazard in a 2.0°C warming scenario by 2050 are concentrated in just ten countries. In two of these, Bangladesh and Pakistan, more than 90 percent of the population could be exposed to at least one climate hazard.

India’s vulnerability to climate hazards

Today, India accounts for more than 17 percent of the world’s population. In a scenario with 2.0°C warming above preindustrial levels by 2050, nearly 70 percent of India’s projected population, or 1.2 billion people, is likely to be exposed to at least one of the four climate hazards analyzed in this report, compared with nearly half of India’s population (0.7 billion) today. India could account for about 25 percent of the total global population likely to be exposed to a climate hazard under a 2.0°C warming scenario by 2050.

Just as the absolute number of people likely to be exposed to hazards is increasing, so too is the proportion of people likely to be exposed to a severe climate hazard. Today, approximately one in six people in India are likely to be exposed to a severe climate hazard that puts lives and livelihoods at risk. Using 2050 population estimates and a scenario with 2.0°C warming above preindustrial levels by 2050, we estimate that this proportion could increase to nearly one in two people.

Severe heat stress is the primary driver of severe climate hazard exposure, potentially affecting approximately 650 million residents of India by 2050 in the 2.0°C warming scenario, compared with just under ten million today (exhibit).

A vast number of people in India could also be exposed to severe climate hazards. Under a scenario with warming 2.0°C above preindustrial levels by 2050, nearly half of India’s projected population—approximately 850 million—could be exposed to a severe climate hazard. This equates to nearly one-quarter of the estimated 3.1 billion people likely to be exposed to a severe climate hazard globally by 2050 under a 2.0°C warming scenario (see sidebar “India’s vulnerability to climate hazards”).

Between now and 2050, population models 13 “Spatial Population Scenarios,” City University of New York and NCAR, updated August 2018, cgd.ucar.edu. project that the world could gain an additional 1.6 billion people, a proportion of whom are likely to be more exposed, more vulnerable, and less resilient to climate impacts.

For example, much of this population growth is likely to come in urban areas. Urbanization is likely to exacerbate the urban heat-island effect, in which human activities cause cities to be warmer than outlying areas, and humid heat waves could take an even greater toll. Urbanization is also likely to increase the exposure of populations in coastal and riverine cities.

In India and other less developed economies, water stress is less a climate problem and more a socioeconomic one. Our work and previous work on the topic have shown that increased water stress is mostly due to increases in demand, primarily driven by population growth in urban areas.

As labor shifts away from agriculture and other outdoor occupations toward indoor work, fewer people may be exposed to the effects of agricultural drought and heat stress. But on balance, many more people will likely be exposed to climate hazards by 2050 than today under either a 1.5°C or a 2.0°C warming scenario above preindustrial levels.

Many regions of the world are already experiencing elevated warming on a regional scale. It is estimated that 20 to 40 percent of today’s global population (depending on the temperature data set used) has experienced mean temperatures of at least 1.5°C higher than the preindustrial average in at least one season. 14 “Chapter 1: Framing and context,” Special report: Global warming of 1.5°C, Intergovernmental Panel on Climate Change (IPCC), 2018, ipcc.ch.

Mitigation will be critical to minimizing risk. However, much of the warming likely to occur in the next decade has already been “locked in” based on past emissions and physical inertia in the climate system. 15 H. Damon Matthews et al., “Focus on cumulative emissions, global carbon budgets, and the implications for climate mitigation targets,” Environmental Research Letters, January 2018, Volume 13, Number 1. Therefore, in addition to accelerating a path to lower emissions, leaders need to build resilience against climate events into their plans.

Around the world, there are examples of innovative ways to build resilience against climate hazards. For example, the regional government of Quintana Roo on Mexico’s Yucatán Peninsula insured its coral reefs in an arrangement with an insurance firm, providing incentives for the insurer to manage any degradation, 16 “World’s first coral reef insurance policy triggered by Hurricane Delta,” Nature Conservancy, December 7, 2020, nature.org. and a redesigned levee system put in place after Hurricane Katrina may have mitigated the worst effects of Hurricane Ida for the citizens of New Orleans. 17 Sarah McQuate, “UW engineer explains how the redesigned levee system in New Orleans helped mitigate the impact of Hurricane Ida,” University of Washington, September 2, 2021, washington.edu.

Nonstate actors may have particular opportunities to help build resilience. For instance, insurance companies may be in a position to encourage institutions to build resilience by offering insurance products for those that make the right investments. This can lower reliance on public money as the first source of funding for recovery from climate events. Civil-engineering companies can participate in innovative public–private partnerships to accelerate infrastructure projects. Companies in the agricultural and food sectors can help farmers around the world mitigate the effects that climate hazards can have on food production—for example, offers of financing can encourage farmers to make investments in resilience. The financial-services sector can get involved by offering better financing rates to borrowers who agree to disclose and reduce emissions and make progress on sustainability goals. And, among other actions, all companies can work to make their own operations and supply chains more resilient.

Accelerating this innovation, and scaling solutions that work quickly, could help us build resilience ahead of the most severe climate hazards.

Harry Bowcott is a senior partner in McKinsey’s London office, Lori Fomenko is a consultant in the Denver office, Alastair Hamilton is a partner in the London office, Mekala Krishnan is a partner at the McKinsey Global Institute (MGI) and a partner in the Boston office, Mihir Mysore is a partner in the Houston office, Alexis Trittipo is an associate partner in the New York office, and Oliver Walker is a director at Vivid Economics, part of McKinsey’s Sustainability Practice.

The authors wish to thank Shruti Badri, Riley Brady, Zach Bruick, Hauke Engel, Meredith Fish, Fabian Franzini, Kelly Kochanski, Romain Paniagua, Hamid Samandari, Humayun Tai, and Kasia Torkarska for their contributions to this article. They also wish to thank external adviser Guiling Wang and the Woodwell Climate Research Center.

Related articles

Solving the net-zero equation: Nine requirements for a more orderly transition

How climate change affects people and populations: A research preview

Hurricane Research Division

Hurricane Research

Dynamics and Physics

Observing Techniques

Modeling and Prediction

Data Assimilation

Hurricane Impacts

Featured Projects

Satellite imagery from the GOES satellite shows the Saharan Air Layer moving across Africa toward the Atlantic Basin.

Saharan Air Layer

Effects on Atlantic Tropical Cyclones

Extratropical Transition

Forecasting Impacts in Midlatitudes

Doppler Winds Thumbnail. Photo Credit: NOAA.

Real-Time Doppler Winds

Analyzing & Delivering Data in “Real-Time”

Photo of the NASA Global Hawk in the hangar going out into a bright pink sunset. Photo Credit, NOAA.

Observing System Experiments

Using Aircraft Data

Hurricane Field Program

Experiments, flight plans, operational maps and information for 2024, hurricane data, and previous years’ data by storm.

View previous years' data by storm on our Data Page. Includes wind speed, temperature and humidity profiles, radar and visuals, and more.

Hurricane Model Viewer

Graphical products for experimental NOAA models and operational models.

2023 Field Program Data

An all-encompassing data suite that includes wind speed, temperature and humidity profiles, radar and visuals, and more.

Research Capability & Expertise

Each WP-3D aircraft has three radars: nose, lower fuselage, and tail. The nose radar (a solid-state C-band radar with a 5° circular beam) is used strictly for flight safety and is not recorded for research purposes. The lower fuselage and tail radars are used for operational and research purposes. The G-IV aircraft has a nose and a tail radar too.

Expendables

Dropwindsondes are deployed from the aircraft and drift down on a parachute, measuring vertical profiles of pressure, temperature, humidity, and wind as they fall. They are released from both the WP-3D and G-IV aircraft over data-sparse oceanic regions. Other expendables include Airborne eXpendable BathyThermographs (AXBTs), Airborne eXpendable Current Profilers (AXCPs), Airborne eXpendable Conductivity Temperature and Depth probes (AXCTDs), and drifting buoys.

Oceanographic instruments may be deployed from the WP-3D aircraft either from external chutes using cartridge-actuated devices (CADs) or from an internal drop chute. They activate upon hitting the ocean surface and radio sea temperature, salinity, and current information back to computers aboard the aircraft.

Visit Expendables Page

Remote Sensing

Among the suite of airborne remote sensing instruments available on the WP-3D aircraft for measuring surface winds in and around tropical cyclones are the Stepped Frequency Microwave Radiometer and the C-band scatterometer (C-SCAT). The C-SCAT conically scans the ocean surface, obtaining backscatter measurements from 20° to 50° off nadir.

C-Band Scatterometer (C-SCAT) The C-SCAT antenna is a microstrip phased array whose main lobe can be pointed at 20°, 30°, 40°, and 50° off nadir. The antenna is rotated in azimuth at 30 rpm. Thus, conical scans of the ocean surface are repeated every 2 s (0.25 km at 125 m/s ground speed).
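As a quick sanity check, the scan repeat period and along-track spacing follow directly from the rotation rate and ground speed quoted above (a minimal sketch; the 30 rpm and 125 m/s figures come from the text):

```python
# Scan geometry for a conically scanning antenna (figures from the text above).
rpm = 30                  # antenna rotation rate, revolutions per minute
ground_speed = 125.0      # aircraft ground speed, m/s

scan_period_s = 60.0 / rpm                    # one full conical scan: 2.0 s
along_track_m = ground_speed * scan_period_s  # distance flown per scan: 250 m

print(scan_period_s, along_track_m)  # 2.0 250.0
```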

Data assimilation is a technique by which numerical model data and observations are combined to obtain an analysis that best represents the state of the atmospheric phenomena of interest. At HRD, the focus is on using a wide range of observations to analyze the state of tropical systems and their near environments, to study their structure and physical/dynamical processes, and to improve numerical forecasts. Research includes the development and application of a state-of-the-art ensemble-based data assimilation system (the Hurricane Ensemble Data Assimilation System, HEDAS) with the operational Hurricane Weather Research and Forecasting (HWRF) model, using airborne, satellite, and other observations. In parallel, Observing System Simulation Experiments are conducted to systematically evaluate proposed observational platforms geared toward better sampling of tropical weather systems.
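The core of any ensemble-based assimilation scheme of this kind is an analysis step that blends a forecast ensemble with observations, weighted by their respective uncertainties. Below is a minimal, illustrative sketch of a stochastic ensemble Kalman filter update on a toy three-variable state; the state size, observation operator, and all numbers are invented for illustration and do not reflect the real HEDAS configuration:

```python
import numpy as np

# Toy stochastic ensemble Kalman filter (EnKF) analysis step.
rng = np.random.default_rng(0)

n_state, n_ens, n_obs = 3, 50, 1
H = np.array([[1.0, 0.0, 0.0]])        # observe the first state variable only
R = np.array([[0.5]])                  # observation-error covariance

# Forecast ensemble: each column is one ensemble member.
X = rng.normal(loc=[[10.0], [5.0], [1.0]], scale=2.0, size=(n_state, n_ens))
y = np.array([12.0])                   # the observation

# Ensemble mean, perturbations, and sample covariances.
Xm = X.mean(axis=1, keepdims=True)
A = X - Xm
HA = H @ A
P_hh = HA @ HA.T / (n_ens - 1)         # covariance in observation space
P_xh = A @ HA.T / (n_ens - 1)          # state/observation cross-covariance
K = P_xh @ np.linalg.inv(P_hh + R)     # Kalman gain

# Perturbed-observation update: each member sees a noisy copy of y.
Y = y[:, None] + rng.normal(scale=np.sqrt(R[0, 0]), size=(n_obs, n_ens))
Xa = X + K @ (Y - H @ X)

# The analysis mean moves from the forecast mean toward the observation.
print(float(Xm[0, 0]), float(Xa.mean(axis=1)[0]))
```

The perturbed-observation form keeps the analysis ensemble spread statistically consistent; deterministic square-root variants avoid the added observation noise.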

AOML developed the high-resolution HWRF model, the first 3 km-resolution regional model to be officially adopted and run operationally by the National Hurricane Center at the start of the 2012 hurricane season. This state-of-the-art research involved the following key elements:

High-resolution numerical model developments; advancements to physical parameterizations for hurricane models based on observations; and, above all, advancements in the basic understanding of hurricane processes.

In collaboration with NCEP's Environmental Modeling Center, and with the vital support of NOAA's Hurricane Forecast Improvement Project (HFIP), we are fully committed for years to come to the development and further advancement of NOAA's HWRF modeling system. A basin-scale version of the HWRF model is now in transition to operations. Visit the Hurricane Modeling and Prediction page to learn more.

News & Events

image of Hurricane Franklin early in cyclogenesis. It is over the Atlantic and does not have an eye.

Unveiling the innovative advancements in hurricane modeling

June 4, 2024

With an active hurricane season on the horizon, the need for reliable hurricane forecasting is at the forefront of our minds. Heightened sea surface temperatures, weakened vertical wind shear, and an enhanced West African monsoon are expected to contribute to the development of tropical cyclones in the Atlantic. To predict these developing storms, meteorologists employ models that use current observations and mathematical calculations to project a storm’s behavior and track. These models are complex, drawing on inputs from a variety of sources, including historical, numeric, oceanic, and atmospheric data, to generate their predictions.

Developments in Hurricane Model Contributed to its Lasting Legacy

Innovative Flight Patterns Boost Hurricane Forecast Accuracy, NOAA Study Finds

12 Days of AOML Research

AOML awarded for exceptional science and communications accomplishments

Improvements in Forecasting, Weather, Floods and Hurricanes

Providing research to make forecasts better.

This overview report includes work on the Hurricane Analysis and Forecasting System (HAFS), a set of moving, high-resolution nests around tropical cyclones in the global weather model, and the AOML Hurricane Model Viewer.

research and analysis project

Featured Publication

High-Definition Hurricanes: Improving Forecasts with Storm-Following Nests: Image of the scientific paper

Alaka Jr, G. J., Zhang, X., & Gopalakrishnan, S. G. (2022). High-definition hurricanes: improving forecasts with storm-following nests.  Bulletin of the American Meteorological Society ,  103 (3), E680-E703.

Abstract: To forecast tropical cyclone (TC) intensity and structure changes with fidelity, numerical weather prediction models must be “high definition,” i.e., horizontal grid spacing ≤ 3 km, so that they permit clouds and convection and resolve sharp gradients of momentum and moisture in the eyewall and rainbands. Storm-following nests are computationally efficient at fine resolutions, providing a practical approach to improve TC intensity forecasts. Under the Hurricane Forecast Improvement Project, the operational Hurricane Weather Research and Forecasting (HWRF) system was developed to include telescopic, storm-following nests for a single TC per model integration.

Download Full Paper

Looking for scientific literature? Visit our Publication Database.

Dropsondes Measure Important Atmospheric Conditions

Airborne radar.

As our Hurricane Hunter Scientists make passes through the storm, they release small sensor packages on parachutes called dropsondes. These instruments provide measurements of temperature, pressure, humidity and wind as they descend through the storm. See more of our videos on YouTube.
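One standard piece of physics behind turning a dropsonde's pressure and temperature profile into heights is the hypsometric equation. The sketch below is illustrative only: the 1000 hPa/850 hPa layer and 288 K mean temperature are invented example values, and real dropsonde processing involves far more quality control:

```python
import math

# Hypsometric equation: geometric thickness between two pressure levels,
# given the layer-mean (virtual) temperature measured during descent.
Rd = 287.05    # J/(kg*K), gas constant for dry air
g = 9.80665    # m/s^2, standard gravity

def layer_thickness_m(p_bottom_hpa, p_top_hpa, mean_temp_k):
    """Thickness (m) of the layer between two pressure levels."""
    return (Rd * mean_temp_k / g) * math.log(p_bottom_hpa / p_top_hpa)

# Example: 1000 hPa to 850 hPa with a 288 K layer-mean temperature.
print(round(layer_thickness_m(1000, 850, 288.0)))  # ~1370 m
```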

Dropsonde Animation Image of the P-3. Photo Credit: NOAA.

Frequently Asked Questions about Hurricanes

Why don't we use nuclear weapons to destroy hurricanes?

Radioactive fallout from such an operation would far outweigh the benefits, and the blast might not alter the storm at all. Additionally, the amount of energy that a storm produces far outweighs the energy produced by one nuclear weapon.

How Much Energy is Released from a Hurricane?

The energy released from a hurricane can be measured in two ways: the total amount of energy released by the condensation of water droplets (latent heat), or the amount of kinetic energy generated to maintain the strong, swirling winds of a hurricane. The vast majority of the latent heat released is used to drive the convection of the storm, but the total power released through condensation is about 6.0 x 10^14 watts, roughly 200 times the worldwide electrical generating capacity, or about 5.2 x 10^19 joules per day. If you measure the kinetic energy instead, it comes out to about 1.5 x 10^12 watts, or about half of the worldwide electrical generating capacity. So although wind seems the most obvious energetic process, it is actually the release of latent heat that powers a hurricane.
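Those two figures reconcile with simple unit arithmetic (a sketch using only the numbers quoted above; a watt is one joule per second, so a daily energy total follows directly):

```python
# Power figures quoted above, expressed in watts (joules per second).
latent_power_w = 6.0e14    # condensation (latent heat release)
kinetic_power_w = 1.5e12   # maintaining the swirling winds

ratio = latent_power_w / kinetic_power_w         # latent dwarfs kinetic
latent_joules_per_day = latent_power_w * 86_400  # seconds in one day

print(ratio)                  # 400.0
print(latent_joules_per_day)  # ~5.2e19 J per day
```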

What Causes Tropical Cyclones?

In addition to hurricane-favorable conditions such as temperature and humidity, many recurring atmospheric phenomena contribute to causing and intensifying tropical cyclones. For example, African Easterly Waves are disturbances in the lower troposphere (from the ocean surface to about 3 miles above) that travel westward from Africa as a result of the African Easterly Jet. These waves are seen from April until November. About 85% of intense hurricanes and about 60% of smaller storms have their origin in African Easterly Waves.

The Saharan Air Layer is another significant seeding phenomenon for tropical storms. It is a mass of dry, mineral-rich, dusty air that forms over the Sahara from late spring to early fall and moves over the tropical North Atlantic every 3-5 days at speeds of 22-55 mph (10-25 meters per second). The air mass is 1-2 miles deep, exists in the lower troposphere, and can be as wide as the continental US. These air masses have significant moderating impacts on tropical cyclone intensity and formation because the dry air can both deprive the storm of moisture and interfere with its convection by increasing the wind shear.

Many tropical cyclones form due to these larger-scale atmospheric factors. Hurricanes that form in the far eastern Atlantic, near the Cape Verde islands, are called Cape Verde hurricanes, named for the location where they form. There can be up to five Cape Verde hurricanes per year, with an average of around two.

Why are Tropical Cyclones Always Worse on the Right Side?

If a hurricane is moving west, the right side is to the north of the storm; if it is heading north, the right side is to the east. A hurricane's motion can be broken into two parts: its rotational (spiral) motion and its forward motion. On the side of the storm where the rotational winds point in the same direction as the forward motion, the two velocities add, so the winds are faster. On the opposite side, the rotational winds point against the forward motion, so the forward speed is subtracted and the winds are slower.

For example, a hurricane with 90 mph winds moving forward at 10 mph would have a 100 mph wind speed on the right (forward-moving) side and 80 mph on the left.
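That add-and-subtract rule is easy to express directly (a minimal sketch; the function name is just for illustration):

```python
def quadrant_wind_speeds(rotational_mph, forward_mph):
    """Approximate peak winds on the right side of a northern-hemisphere
    storm's track (velocities add) and the left side (forward speed
    subtracts from the rotational wind)."""
    right = rotational_mph + forward_mph
    left = rotational_mph - forward_mph
    return right, left

print(quadrant_wind_speeds(90, 10))  # (100, 80)
```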

How are Hurricanes Named?

During the 19th century, hurricane names were inspired by everything from saints to wives to unpopular politicians. In 1978, it was agreed that the National Hurricane Center would use alternating men's and women's names, following the practice adopted by Australia's Bureau of Meteorology three years earlier, in 1975.

Today, the list of potential names for the Atlantic basin is published by the United Nations' World Meteorological Organization, and each list repeats every six years. If a particularly damaging storm occurs, the name of that storm is retired. Storms retired in 2017 include Harvey, Irma, Maria, and Nate. If there are more storms than names on the list in a given season, the National Hurricane Center names them from a supplemental list (the Greek alphabet served this purpose until it was retired in 2021). Lastly, if a storm moves across basins, it keeps its original name; the only time it is renamed is if it dissipates to a tropical disturbance and re-forms.

With Hurricane Hunters Dr. Frank Marks & Commander Justin Kibbey

Shirley Murillo

305.361.4509

Acting Director, Hurricane Research Division

Aaron Poyer

301.427.9619

Acting Deputy Director, Hurricane Research Division

Our Field Photos

Scientists and Hurricane Hunters Paul Reasor and Robert Rogers preparing for flight into Hurricane Barry. Photo Credit, NOAA AOML.

The Global Hawk unmanned aircraft can fly continuously for 24 hours in a storm, collecting critical atmospheric data. Photo Credit, NOAA AOML.

Sunrise photo above the clouds from a hurricane hunter aircraft. Photo Credit, NOAA AOML.

Frank Marks takes a selfie with Miss Piggy. The P-3 aircraft have nicknames; this one is called Miss Piggy, after the Jim Henson Muppets character. Photo Credit, NOAA AOML.

Photo of the P3 flying science lab on the tarmac ready for its next flight. Photo Credit, NOAA AOML.

Scientist drops scientific instruments into the hurricane below to take measurements that improve our forecasts. Photo Credit, NOAA AOML.

The eye of a hurricane as seen by a P3 aircraft. Photo Credit, NOAA AOML.

Hurricane researchers Paul Reasor (L) and Rob Rogers (R) are hard at work analyzing data during their flight into Tropical Storm Barry. Photo Credit: NOAA.

The Columbia University Journal of Global Health

Preliminary analysis of the disability landscape on Roatán, Honduras

Understanding the needs of persons with disabilities (PWDs) is vital to improving targeted healthcare and resources. This project seeks to assess the prevalence of disabilities, the resources used, and the care and treatment needs of PWDs on Roatán, Honduras. There is little to no prior research about disabilities on the island of Roatán, and few disability studies are available in Honduras. Over a period of six weeks, we surveyed 581 community members on the island of Roatán about their own and their family members' disability status and the resources used by PWDs. Interviews were conducted with physicians, promotoras (community health promoters), and staff at the local Rehabilitation Clinic to assess the social experiences and resource needs of PWDs on the island. Of the 613 subjects covered by our surveys, 258 (42%) had one or more disabilities. The most common disabilities were vision impairment, mobility impairment, and diabetes. In 44.98% of cases, respondents reported that the PWD did not visit any medical care facility to receive treatment. We found a lack of disability-specific resources on Roatán and no consistent definition of disability among community members and healthcare providers. Barriers to care include discrimination; caretaker burden; lack of medications, assistive devices, and specialists; and transportation. Our research highlights the need for more education on disabilities within communities, as well as for increasing the amount and depth of disability-specific resources accessible on the island. This study was conducted at the request of Clinica Esperanza to determine how it could better support PWDs on Roatán and the potential benefit of developing a day home for PWDs.

Article Details

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

IMAGES

  1. Data Analysis Plan for Quantitative Research Analysis

    research and analysis project

  2. Research and analysis project

    research and analysis project

  3. How to Do a Research Project: Step-by-Step Process |Leverage Edu

    research and analysis project

  4. Research and analysis project

    research and analysis project

  5. RESEARCH AND ANALYSIS PROJECT

    research and analysis project

  6. Research Project Lifecycle

    research and analysis project

VIDEO

  1. Basic Analysis Project: Generate report

  2. Oxford Brookes University ACCA

  3. Adriana, MSc Aerospace Management student

  4. SciVal A tool for evidence based research planning Identifying trending topics

  5. Exploratory Data Analysis Overview

  6. How to choose Your OBU RAP Topic?

COMMENTS

  1. Research and Analysis Project (RAP) submission

    The Oxford Brookes Research and Analysis Project (RAP) submission fee for period 45 is £470 and period 46 is £470. The link will close at 12.00 (midnight) GMT on the day stated on the project submission dates page. Should you experience any difficulties uploading your Research and Analysis Project, ...

  2. Research Project

    Definition: Research Project is a planned and systematic investigation into a specific area of interest or problem, with the goal of generating new knowledge, insights, or solutions. It typically involves identifying a research question or hypothesis, designing a study to test it, collecting and analyzing data, and drawing conclusions based on ...

  3. A Beginner's Guide to Starting the Research Process

    Step 4: Create a research design. The research design is a practical framework for answering your research questions. It involves making decisions about the type of data you need, the methods you'll use to collect and analyze it, and the location and timescale of your research. There are often many possible paths you can take to answering ...

  4. What Is a Research Design

    Step 1: Consider your aims and approach. Step 2: Choose a type of research design. Step 3: Identify your population and sampling method. Step 4: Choose your data collection methods. Step 5: Plan your data collection procedures. Step 6: Decide on your data analysis strategies. Other interesting articles.

  5. Research Methods

    Research methods are specific procedures for collecting and analyzing data. Developing your research methods is an integral part of your research design. When planning your methods, there are two key decisions you will make. First, decide how you will collect data. Your methods depend on what type of data you need to answer your research question:

  6. Research Design

    Table of contents. Step 1: Consider your aims and approach. Step 2: Choose a type of research design. Step 3: Identify your population and sampling method. Step 4: Choose your data collection methods. Step 5: Plan your data collection procedures. Step 6: Decide on your data analysis strategies.

  7. PDF Submission of Research and Analysis Project

    Each element, Research Report, Skills and Learning Statement, Reference list, Excel spreadsheet, Presentation slides, and appendices (including resubmission statement, if relevant) are known as assignments. There are two stages in the submission process, uploading assignments (the various documents) and submitting the project.

  8. How to do a research project for your academic study

    A research project for students is an extended essay that presents a question or statement for analysis and evaluation. During a research project, you will present your own ideas and research on a subject alongside analysing existing knowledge. How to write a research report The next section covers the research project steps necessary to ...

  9. PDF Exemplars

    to anonymise them by removing references to the author, the company the project is on and any references they have used. Topic 8 is the most popular topic chosen by students, therefore we have reflected in the selection of Research and Analysis projects below. All references to exemplars refer to the same individual project as indexed below. Index:

  10. Introduction to Research Statistical Analysis: An Overview of the

    Introduction. Statistical analysis is necessary for any research project seeking to make quantitative conclusions. The following is a primer for research-based statistical analysis. It is intended to be a high-level overview of appropriate statistical testing, while not diving too deep into any specific methodology.

  11. Planning Qualitative Research: Design and Decision Making for New

    For students conducting their first qualitative research project, the choice of approach and subsequent alignment among problem, research questions, data collection, and data analysis can be particularly difficult. ... Indeed, there are other approaches for conducting qualitative research, including grounded theory, discourse analysis, feminist ...

  12. The Library: Research Skills: Analysing and Presenting Data

    Overview. Data analysis is an ongoing process that should occur throughout your research project. Suitable data-analysis methods must be selected when you write your research proposal. The nature of your data (i.e. quantitative or qualitative) will be influenced by your research design and purpose. The data will also influence the analysis ...

  13. How To Write an Analysis (With Examples and Tips)

    An effective analysis can be valuable for making informed decisions based on data and research. Writing an analysis can help you build support around a particular idea, cause or project. Knowing how to write one is a valuable skill for any career. In this article, you will learn what an analysis is, why it's an important tool to use in ...

  14. What is Project Analysis and Why it is Important?

    You analyze it! Consistent project analysis helps you make the right choices at the right time, leading you towards a more successful outcome and the highest possible ROI. Here we will talk about project analysis, its importance, the different types of project analysis, and lastly, how you can implement it using the right tools.

  15. Data Analysis in Research: Types & Methods

    Definition of research in data analysis: According to LeCompte and Schensul, research data analysis is a process used by researchers to reduce data to a story and interpret it to derive insights. The data analysis process helps reduce a large chunk of data into smaller fragments, which makes sense. Three essential things occur during the data ...

  16. The Research and Analysis Project (RAP)

    For example if you were proposing to submit a project in May 2019 for topic 19 the merger or acquisition would have to have taken place AFTER 1 JANUARY 2016. In addition to the Research & Analysis report you must also submit a 2,000 word (approx.) Skills and Learning Statement and Presentation slides (see below) at the same time.

  17. How to Write a Research Proposal

    Research proposal examples. Writing a research proposal can be quite challenging, but a good starting point could be to look at some examples. We've included a few for you below. Example research proposal #1: "A Conceptual Framework for Scheduling Constraint Management".

  18. How to write a research proposal?

    A proposal needs to show how your work fits into what is already known about the topic and what new paradigm it will add to the literature, while specifying the question that the research will answer, establishing its significance, and the implications of the answer. [2] The proposal must be capable of convincing the evaluation committee about ...

  19. Data Analysis in Research: Types & Methods

    Data analysis is a crucial step in the research process because it enables companies and researchers to glean insightful information from data. By using diverse analytical methodologies and approaches, scholars may reveal latent patterns, arrive at well-informed conclusions, and tackle intricate research inquiries.

  20. How to use and assess qualitative research methods

    How to conduct qualitative research? Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [13, 14]. As Fossey puts it: "sampling, data collection, analysis and interpretation are related to each other in a cyclical ...

  21. Project Analysis: A Guide for Project Managers and Teams

    Project analysis is the process of evaluating a project to determine its feasibility, constraints, and potential risks. This analysis helps you optimize resource allocation, devise contingency plans, improve project execution, and ultimately achieve your project goals promptly and efficiently.

  22. IARPA

    Performer teams will conduct research and development to build systems that will be evaluated by an independent testing and evaluation (T&E) team. Independent T&E will ensure that REASON is effective in helping analysts discover valuable evidence, identify strengths and weaknesses in reasoning, and produce higher quality reports.

  23. Protecting people from a changing climate

    Our research consists of a global analysis of the exposure of people's lives and livelihoods to multiple hazards related to a changing climate. This analysis identifies people who are potentially vulnerable to four core climate hazards—heat stress, urban water stress, agricultural drought, and riverine and coastal flooding—even if warming ...

  24. Introduction to Data Analysis using Microsoft Excel

    In this project, you will learn the foundation of data analysis with Microsoft Excel using sales data from a sample company. You will learn how to use sorting and filtering tools to reorganize your data and access specific information about your data.
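The sorting and filtering operations described above translate directly to code; a minimal sketch using pandas, with a made-up sales table standing in for the sample company's data:

```python
import pandas as pd

# Hypothetical sales data standing in for the course's sample spreadsheet
sales = pd.DataFrame({
    "region": ["North", "South", "North", "West"],
    "units": [120, 75, 200, 50],
    "revenue": [2400.0, 1500.0, 4000.0, 1000.0],
})

# Reorganise the data by revenue, highest first (analogous to Excel's Sort)
by_revenue = sales.sort_values("revenue", ascending=False)

# Access specific rows matching a condition (analogous to Excel's Filter)
north_only = sales[sales["region"] == "North"]
```

`sort_values` returns a reordered copy, and the boolean mask `sales["region"] == "North"` selects only the matching rows, mirroring the two Excel tools the course introduces.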

  25. The Beginner's Guide to Statistical Analysis

    Step 1: Write your hypotheses and plan your research design. To collect valid data for statistical analysis, you first need to specify your hypotheses and plan out your research design. Writing statistical hypotheses. The goal of research is often to investigate a relationship between variables within a population. You start with a prediction ...
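Step 1 above, stating a hypothesis about a relationship between variables, leads naturally to a test statistic. A standard-library sketch of a two-sample t-test, with illustrative made-up measurements:

```python
import math
import statistics

# Hypothetical measurements for two groups (values are illustrative only)
group_a = [5.1, 4.9, 5.6, 5.0, 5.3, 5.2]
group_b = [4.4, 4.7, 4.3, 4.6, 4.5, 4.2]

# H0: both groups share the same population mean
mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
n_a, n_b = len(group_a), len(group_b)

# Pooled two-sample t statistic (equal-variance form)
pooled_var = ((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)
t_stat = (mean_a - mean_b) / math.sqrt(pooled_var * (1 / n_a + 1 / n_b))

# A large |t| (well above the ~2.23 critical value for 10 degrees of
# freedom at the 5% level) is evidence against H0
```

The prediction is written down first, and only then is the statistic computed against it, which is the ordering the guide's Step 1 insists on.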

  26. Hurricane Research Division

    AOML's hurricane research division has been in operation longer than AOML itself, and is developing next-generation hurricane modelling. ... Data assimilation is a technique by which numerical model data and observations are combined to obtain an analysis that best represents the state of the atmospheric phenomena of interest. ... and with the ...
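The combining step that data assimilation describes can be illustrated with a single-variable sketch in the style of optimal interpolation; all values and error variances below are illustrative assumptions, not AOML's method:

```python
# Model forecast and observation of one quantity (e.g. temperature in C);
# the numbers and error variances are illustrative assumptions
forecast = 25.0        # model's prior estimate
observation = 27.0     # measured value
var_forecast = 4.0     # forecast error variance
var_obs = 1.0          # observation error variance

# Kalman-style gain: weight the observation by relative confidence
gain = var_forecast / (var_forecast + var_obs)

# The analysis blends model and observation, landing nearer the more
# trusted (lower-variance) source
analysis = forecast + gain * (observation - forecast)
```

Because the observation error variance is smaller here, the gain is close to 1 and the analysis sits nearer the observation than the forecast.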

  27. How To Start A Business In 11 Steps (2024 Guide)

    Primary Research. The first stage of any competition study is primary research, which entails obtaining data directly from potential customers rather than basing your conclusions on past data ...

  28. Quantifying Schedule Delay Risk in Construction Projects: A Data‐Driven

    1. Introduction. Progress management is a critical component of construction project management. Any improper management will result in project delays, cost overruns, and quality issues, which can lead to disappointment for clients, financial loss for contractors, and potential safety hazards for workers and the public []. Therefore, it is particularly necessary to confirm whether the ...

  29. What Is a Research Methodology?

    What Is a Research Methodology? | Steps & Tips. Published on August 25, 2022 by Shona McCombes and Tegan George. Revised on November 20, 2023. Your research methodology discusses and explains the data collection and analysis methods you used in your research. A key part of your thesis, dissertation, or research paper, the methodology chapter explains what you did and how you did it, allowing ...

  30. Preliminary analysis of the disability landscape on Roatán, Honduras

    Understanding the needs of persons with disabilities (PWDs) is vital to improving targeted healthcare and resources. The project seeks to assess the prevalence of disabilities, resources used, and care and treatment needs for PWDs on Roatán, Honduras. There is little to no prior research about disabilities on the island of Roatán, and few disability studies available in the country of Honduras.