University of Derby

Literature Reviews: systematic searching at various levels

  • for assignments
  • for dissertations / theses
  • Search strategy and searching
  • Boolean Operators
  • Search strategy template
  • Screening & critiquing
  • Citation Searching
  • Google Scholar (with Lean Library)
  • Resources for literature reviews
  • Adding a referencing style to EndNote
  • Exporting from different databases

PRISMA Flow Diagram

  • Grey Literature
  • What is the PRISMA Flow Diagram?
  • How should I use it?
  • When should I use it?
  • PRISMA Links

The PRISMA Flow Diagram is a tool that can be used to record the different stages of the literature search process (across multiple resources) and to show clearly how a researcher went from 'these are the databases I searched for my terms' to 'these are the papers I'm going to discuss'.

PRISMA is not inflexible; it can be modified to suit the research needs of different people, and a Google Images search for the flow diagram will show many different versions in use. It's a good idea to look at a couple of those examples, and also at a couple of the articles on the PRISMA website, to see how the diagram has been, and can be, used.

The PRISMA 2020 Statement was published in 2021. It consists of a checklist and a flow diagram, and is intended to be accompanied by the PRISMA 2020 Explanation and Elaboration document.

In order to encourage dissemination of the PRISMA 2020 Statement, it has been published in several journals.

  • How to use the PRISMA Flow Diagram for literature reviews: a PDF [3.81MB] of the PowerPoint used to create the video. Each slide that has notes has a callout icon at the top right of the page, which can be toggled on or off to make the notes visible.

There is also a PowerPoint version of the document but the file size is too large to upload here.

If you would like a copy, please email the Academic Librarians' mailbox from your university account to ask for it to be sent to you.

This is an example of how you could fill in the PRISMA flow diagram when conducting a new review. It is not a hard and fast rule, but it should give you an idea of how you can use it.

For more detailed information, please have a look at this article:

Page, M.J., McKenzie, J.E., Bossuyt, P.M., Boutron, I., Hoffmann, T.C., Mulrow, C.D., Shamseer, L., Tetzlaff, J.M., Akl, E.A., Brennan, S.E., Chou, R., Glanville, J., Grimshaw, J.M., Hróbjartsson, A., Lalu, M.M., Li, T., Loder, E.W., Mayo-Wilson, E., McDonald, S., McGuinness, L.A., Stewart, L.A., Thomas, J., Tricco, A.C., Welch, V.A., Whiting, P. & Moher, D. (2021) 'The PRISMA 2020 statement: an updated guideline for reporting systematic reviews', BMJ, 372:n71. doi: 10.1136/bmj.n71.

  • Example of PRISMA 2020 diagram This is an example of *one* of the PRISMA 2020 flow diagrams you can use when reporting on your research process. There is more than one form that you can use, so for other forms and advice please look at the PRISMA website for full details.

Start filling in the flow diagram as soon as you begin searching the databases you've decided upon.

Make sure that you record the number of results you found per database (before removing any duplicates), as in the filled-in example. You can also do a Google Images search for the PRISMA flow diagram to see the different ways in which people have used it to express their search processes.
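As a rough illustration of this bookkeeping, the sketch below (with entirely made-up search results and a simplified notion of record identity) tallies results per database for the "Identification" boxes and then removes duplicates across databases. In practice a reference manager such as EndNote would do the deduplication for you.

```python
# Hypothetical sketch: tally search results per database for a PRISMA 2020
# flow diagram, then remove duplicates across databases. Record identity is
# simplified here to DOI (or a normalised title when no DOI is present);
# real workflows rely on reference-management software.

def normalise(record):
    """Key a record by DOI if present, else by its lower-cased title."""
    return record.get("doi") or record["title"].strip().lower()

# Made-up example results from three databases.
results = {
    "Medline": [
        {"title": "Exercise and mood", "doi": "10.1/a"},
        {"title": "Sleep and memory", "doi": "10.1/b"},
    ],
    "CINAHL": [
        {"title": "Exercise and mood", "doi": "10.1/a"},  # duplicate of Medline hit
        {"title": "Diet and cognition", "doi": "10.1/c"},
    ],
    "PsycINFO": [
        {"title": "Sleep and Memory", "doi": "10.1/b"},   # duplicate of Medline hit
    ],
}

# Counts to report per database, BEFORE deduplication.
per_database = {db: len(records) for db, records in results.items()}
total_identified = sum(per_database.values())

# Deduplicate across all databases.
unique = {normalise(r): r for records in results.values() for r in records}
duplicates_removed = total_identified - len(unique)

print(per_database)        # {'Medline': 2, 'CINAHL': 2, 'PsycINFO': 1}
print(total_identified)    # 5
print(duplicates_removed)  # 2
print(len(unique))         # 3 records go forward to screening
```

The per-database figures feed the "Records identified" box, and the deduplicated total is what you carry forward to the "Records screened" stage.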

  • Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) PRISMA is an evidence-based minimum set of items for reporting in systematic reviews and meta-analyses. PRISMA focuses on the reporting of reviews evaluating randomized trials, but can also be used as a basis for reporting systematic reviews of other types of research, particularly evaluations of interventions.
  • PRISMA Flow Diagram This link will take you to downloadable Word and PDF copies of the flow diagram. These are modifiable and act as a starting point for you to record the process you engaged in, from first search to the papers you ultimately discuss in your work. Do an image search on the internet for the flow diagram and you will see all the different ways that people have modified the diagram to suit their personal research needs.

You can access the various checklists via the Equator website and the articles explaining PRISMA and its various extensions are available via PubMed.

Page, M.J., McKenzie, J.E., Bossuyt, P.M., Boutron, I., Hoffmann, T.C., Mulrow, C.D., Shamseer, L., Tetzlaff, J.M., Akl, E.A., Brennan, S.E., Chou, R., Glanville, J., Grimshaw, J.M., Hróbjartsson, A., Lalu, M.M., Li, T., Loder, E.W., Mayo-Wilson, E., McDonald, S., McGuinness, L.A., Stewart, L.A., Thomas, J., Tricco, A.C., Welch, V.A., Whiting, P. & Moher, D. (2021) 'The PRISMA 2020 statement: an updated guideline for reporting systematic reviews', BMJ, 372:n71. doi: 10.1136/bmj.n71.

Page, M.J., Moher, D., Bossuyt, P.M., Boutron, I., Hoffmann, T.C., Mulrow, C.D., Shamseer, L., Tetzlaff, J.M., Akl, E.A., Brennan, S.E., Chou, R., Glanville, J., Grimshaw, J.M., Hróbjartsson, A., Lalu, M.M., Li, T., Loder, E.W., Mayo-Wilson, E., McDonald, S., McGuinness, L.A., Stewart, L.A., Thomas, J., Tricco, A.C., Welch, V.A., Whiting, P. & McKenzie, J.E. (2021) 'PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews', BMJ, 372:n160. doi: 10.1136/bmj.n160.

Page, M.J., McKenzie, J.E., Bossuyt, P.M., Boutron, I., Hoffmann, T.C., Mulrow, C.D., Shamseer, L., Tetzlaff, J.M., Akl, E.A., Brennan, S.E., Chou, R., Glanville, J., Grimshaw, J.M., Hróbjartsson, A., Lalu, M.M., Li, T., Loder, E.W., Mayo-Wilson, E., McDonald, S., McGuinness, L.A., Stewart, L.A., Thomas, J., Tricco, A.C., Welch, V.A., Whiting, P. & Moher, D. (2021) 'The PRISMA 2020 statement: an updated guideline for reporting systematic reviews', Journal of Clinical Epidemiology, 134:178-189. doi: 10.1016/j.jclinepi.2021.03.001.

  • Last Updated: Apr 12, 2024 11:57 AM
  • URL: https://libguides.derby.ac.uk/literature-reviews

The PRISMA 2020 statement: an updated guideline for reporting systematic reviews

  • Matthew J Page , senior research fellow 1 ,
  • Joanne E McKenzie , associate professor 1 ,
  • Patrick M Bossuyt , professor 2 ,
  • Isabelle Boutron , professor 3 ,
  • Tammy C Hoffmann , professor 4 ,
  • Cynthia D Mulrow , professor 5 ,
  • Larissa Shamseer , doctoral student 6 ,
  • Jennifer M Tetzlaff , research product specialist 7 ,
  • Elie A Akl , professor 8 ,
  • Sue E Brennan , senior research fellow 1 ,
  • Roger Chou , professor 9 ,
  • Julie Glanville , associate director 10 ,
  • Jeremy M Grimshaw , professor 11 ,
  • Asbjørn Hróbjartsson , professor 12 ,
  • Manoj M Lalu , associate scientist and assistant professor 13 ,
  • Tianjing Li , associate professor 14 ,
  • Elizabeth W Loder , professor 15 ,
  • Evan Mayo-Wilson , associate professor 16 ,
  • Steve McDonald , senior research fellow 1 ,
  • Luke A McGuinness , research associate 17 ,
  • Lesley A Stewart , professor and director 18 ,
  • James Thomas , professor 19 ,
  • Andrea C Tricco , scientist and associate professor 20 ,
  • Vivian A Welch , associate professor 21 ,
  • Penny Whiting , associate professor 17 ,
  • David Moher , director and professor 22
  • 1 School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
  • 2 Department of Clinical Epidemiology, Biostatistics and Bioinformatics, Amsterdam University Medical Centres, University of Amsterdam, Amsterdam, Netherlands
  • 3 Université de Paris, Centre of Epidemiology and Statistics (CRESS), Inserm, F 75004 Paris, France
  • 4 Institute for Evidence-Based Healthcare, Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Australia
  • 5 University of Texas Health Science Center at San Antonio, San Antonio, Texas, USA; Annals of Internal Medicine
  • 6 Knowledge Translation Program, Li Ka Shing Knowledge Institute, Toronto, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • 7 Evidence Partners, Ottawa, Canada
  • 8 Clinical Research Institute, American University of Beirut, Beirut, Lebanon; Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada
  • 9 Department of Medical Informatics and Clinical Epidemiology, Oregon Health & Science University, Portland, Oregon, USA
  • 10 York Health Economics Consortium (YHEC Ltd), University of York, York, UK
  • 11 Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada; Department of Medicine, University of Ottawa, Ottawa, Canada
  • 12 Centre for Evidence-Based Medicine Odense (CEBMO) and Cochrane Denmark, Department of Clinical Research, University of Southern Denmark, Odense, Denmark; Open Patient data Exploratory Network (OPEN), Odense University Hospital, Odense, Denmark
  • 13 Department of Anesthesiology and Pain Medicine, The Ottawa Hospital, Ottawa, Canada; Clinical Epidemiology Program, Blueprint Translational Research Group, Ottawa Hospital Research Institute, Ottawa, Canada; Regenerative Medicine Program, Ottawa Hospital Research Institute, Ottawa, Canada
  • 14 Department of Ophthalmology, School of Medicine, University of Colorado Denver, Denver, Colorado, United States; Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA
  • 15 Division of Headache, Department of Neurology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA; Head of Research, The BMJ , London, UK
  • 16 Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, Indiana, USA
  • 17 Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
  • 18 Centre for Reviews and Dissemination, University of York, York, UK
  • 19 EPPI-Centre, UCL Social Research Institute, University College London, London, UK
  • 20 Li Ka Shing Knowledge Institute of St. Michael's Hospital, Unity Health Toronto, Toronto, Canada; Epidemiology Division of the Dalla Lana School of Public Health and the Institute of Health Management, Policy, and Evaluation, University of Toronto, Toronto, Canada; Queen's Collaboration for Health Care Quality Joanna Briggs Institute Centre of Excellence, Queen's University, Kingston, Canada
  • 21 Methods Centre, Bruyère Research Institute, Ottawa, Ontario, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • 22 Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • Correspondence to: M J Page matthew.page{at}monash.edu
  • Accepted 4 January 2021

The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, published in 2009, was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found. Over the past decade, advances in systematic review methodology and terminology have necessitated an update to the guideline. The PRISMA 2020 statement replaces the 2009 statement and includes new reporting guidance that reflects advances in methods to identify, select, appraise, and synthesise studies. The structure and presentation of the items have been modified to facilitate implementation. In this article, we present the PRISMA 2020 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and the revised flow diagrams for original and updated reviews.

Systematic reviews serve many critical roles. They can provide syntheses of the state of knowledge in a field, from which future research priorities can be identified; they can address questions that otherwise could not be answered by individual studies; they can identify problems in primary research that should be rectified in future studies; and they can generate or evaluate theories about how or why phenomena occur. Systematic reviews therefore generate various types of knowledge for different users of reviews (such as patients, healthcare providers, researchers, and policy makers). 1 2 To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did (such as how studies were identified and selected) and what they found (such as characteristics of contributing studies and results of meta-analyses). Up-to-date reporting guidance facilitates authors achieving this. 3

The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement published in 2009 (hereafter referred to as PRISMA 2009) 4 5 6 7 8 9 10 is a reporting guideline designed to address poor reporting of systematic reviews. 11 The PRISMA 2009 statement comprised a checklist of 27 items recommended for reporting in systematic reviews and an “explanation and elaboration” paper 12 13 14 15 16 providing additional reporting guidance for each item, along with exemplars of reporting. The recommendations have been widely endorsed and adopted, as evidenced by its co-publication in multiple journals, citation in over 60 000 reports (Scopus, August 2020), endorsement from almost 200 journals and systematic review organisations, and adoption in various disciplines. Evidence from observational studies suggests that use of the PRISMA 2009 statement is associated with more complete reporting of systematic reviews, 17 18 19 20 although more could be done to improve adherence to the guideline. 21

Many innovations in the conduct of systematic reviews have occurred since publication of the PRISMA 2009 statement. For example, technological advances have enabled the use of natural language processing and machine learning to identify relevant evidence, 22 23 24 methods have been proposed to synthesise and present findings when meta-analysis is not possible or appropriate, 25 26 27 and new methods have been developed to assess the risk of bias in results of included studies. 28 29 Evidence on sources of bias in systematic reviews has accrued, culminating in the development of new tools to appraise the conduct of systematic reviews. 30 31 Terminology used to describe particular review processes has also evolved, as in the shift from assessing “quality” to assessing “certainty” in the body of evidence. 32 In addition, the publishing landscape has transformed, with multiple avenues now available for registering and disseminating systematic review protocols, 33 34 disseminating reports of systematic reviews, and sharing data and materials, such as preprint servers and publicly accessible repositories. Capturing these advances in the reporting of systematic reviews necessitated an update to the PRISMA 2009 statement.

Summary points

To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did, and what they found

The PRISMA 2020 statement provides updated reporting guidance for systematic reviews that reflects advances in methods to identify, select, appraise, and synthesise studies

The PRISMA 2020 statement consists of a 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and revised flow diagrams for original and updated reviews

We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders

Development of PRISMA 2020

A complete description of the methods used to develop PRISMA 2020 is available elsewhere. 35 We identified PRISMA 2009 items that were often reported incompletely by examining the results of studies investigating the transparency of reporting of published reviews. 17 21 36 37 We identified possible modifications to the PRISMA 2009 statement by reviewing 60 documents providing reporting guidance for systematic reviews (including reporting guidelines, handbooks, tools, and meta-research studies). 38 These reviews of the literature were used to inform the content of a survey with suggested possible modifications to the 27 items in PRISMA 2009 and possible additional items. Respondents were asked whether they believed we should keep each PRISMA 2009 item as is, modify it, or remove it, and whether we should add each additional item. Systematic review methodologists and journal editors were invited to complete the online survey (110 of 220 invited responded). We discussed proposed content and wording of the PRISMA 2020 statement, as informed by the review and survey results, at a 21-member, two-day, in-person meeting in September 2018 in Edinburgh, Scotland. Throughout 2019 and 2020, we circulated an initial draft and five revisions of the checklist and explanation and elaboration paper to co-authors for feedback. In April 2020, we invited 22 systematic reviewers who had expressed interest in providing feedback on the PRISMA 2020 checklist to share their views (via an online survey) on the layout and terminology used in a preliminary version of the checklist. Feedback was received from 15 individuals and considered by the first author, and any revisions deemed necessary were incorporated before the final version was approved and endorsed by all co-authors.

The PRISMA 2020 statement

Scope of the guideline.

The PRISMA 2020 statement has been designed primarily for systematic reviews of studies that evaluate the effects of health interventions, irrespective of the design of the included studies. However, the checklist items are applicable to reports of systematic reviews evaluating other interventions (such as social or educational interventions), and many items are applicable to systematic reviews with objectives other than evaluating interventions (such as evaluating aetiology, prevalence, or prognosis). PRISMA 2020 is intended for use in systematic reviews that include synthesis (such as pairwise meta-analysis or other statistical synthesis methods) or do not include synthesis (for example, because only one eligible study is identified). The PRISMA 2020 items are relevant for mixed-methods systematic reviews (which include quantitative and qualitative studies), but reporting guidelines addressing the presentation and synthesis of qualitative data should also be consulted. 39 40 PRISMA 2020 can be used for original systematic reviews, updated systematic reviews, or continually updated (“living”) systematic reviews. However, for updated and living systematic reviews, there may be some additional considerations that need to be addressed. Where there is relevant content from other reporting guidelines, we reference these guidelines within the items in the explanation and elaboration paper 41 (such as PRISMA-Search 42 in items 6 and 7, Synthesis without meta-analysis (SWiM) reporting guideline 27 in item 13d). Box 1 includes a glossary of terms used throughout the PRISMA 2020 statement.

Glossary of terms

Systematic review —A review that uses explicit, systematic methods to collate and synthesise findings of studies that address a clearly formulated question 43

Statistical synthesis —The combination of quantitative results of two or more studies. This encompasses meta-analysis of effect estimates (described below) and other methods, such as combining P values, calculating the range and distribution of observed effects, and vote counting based on the direction of effect (see McKenzie and Brennan 25 for a description of each method)

Meta-analysis of effect estimates —A statistical technique used to synthesise results when study effect estimates and their variances are available, yielding a quantitative summary of results 25

Outcome —An event or measurement collected for participants in a study (such as quality of life, mortality)

Result —The combination of a point estimate (such as a mean difference, risk ratio, or proportion) and a measure of its precision (such as a confidence/credible interval) for a particular outcome

Report —A document (paper or electronic) supplying information about a particular study. It could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report, or any other document providing relevant information

Record —The title or abstract (or both) of a report indexed in a database or website (such as a title or abstract for an article indexed in Medline). Records that refer to the same report (such as the same journal article) are “duplicates”; however, records that refer to reports that are merely similar (such as a similar abstract submitted to two different conferences) should be considered unique.

Study —An investigation, such as a clinical trial, that includes a defined group of participants and one or more interventions and outcomes. A “study” might have multiple reports. For example, reports could include the protocol, statistical analysis plan, baseline characteristics, results for the primary outcome, results for harms, results for secondary outcomes, and results for additional mediator and moderator analyses

PRISMA 2020 is not intended to guide systematic review conduct, for which comprehensive resources are available. 43 44 45 46 However, familiarity with PRISMA 2020 is useful when planning and conducting systematic reviews to ensure that all recommended information is captured. PRISMA 2020 should not be used to assess the conduct or methodological quality of systematic reviews; other tools exist for this purpose. 30 31 Furthermore, PRISMA 2020 is not intended to inform the reporting of systematic review protocols, for which a separate statement is available (PRISMA for Protocols (PRISMA-P) 2015 statement 47 48 ). Finally, extensions to the PRISMA 2009 statement have been developed to guide reporting of network meta-analyses, 49 meta-analyses of individual participant data, 50 systematic reviews of harms, 51 systematic reviews of diagnostic test accuracy studies, 52 and scoping reviews 53 ; for these types of reviews we recommend authors report their review in accordance with the recommendations in PRISMA 2020 along with the guidance specific to the extension.

How to use PRISMA 2020

The PRISMA 2020 statement (including the checklists, explanation and elaboration, and flow diagram) replaces the PRISMA 2009 statement, which should no longer be used. Box 2 summarises noteworthy changes from the PRISMA 2009 statement. The PRISMA 2020 checklist includes seven sections with 27 items, some of which include sub-items ( table 1 ). A checklist for journal and conference abstracts for systematic reviews is included in PRISMA 2020. This abstract checklist is an update of the 2013 PRISMA for Abstracts statement, 54 reflecting new and modified content in PRISMA 2020 ( table 2 ). A template PRISMA flow diagram is provided, which can be modified depending on whether the systematic review is original or updated ( fig 1 ).

Noteworthy changes to the PRISMA 2009 statement

Inclusion of the abstract reporting checklist within PRISMA 2020 (see item #2 and table 2 ).

Movement of the ‘Protocol and registration’ item from the start of the Methods section of the checklist to a new Other section, with addition of a sub-item recommending authors describe amendments to information provided at registration or in the protocol (see item #24a-24c).

Modification of the ‘Search’ item to recommend authors present full search strategies for all databases, registers and websites searched, not just at least one database (see item #7).

Modification of the ‘Study selection’ item in the Methods section to emphasise the reporting of how many reviewers screened each record and each report retrieved, whether they worked independently, and if applicable, details of automation tools used in the process (see item #8).

Addition of a sub-item to the ‘Data items’ item recommending authors report how outcomes were defined, which results were sought, and methods for selecting a subset of results from included studies (see item #10a).

Splitting of the ‘Synthesis of results’ item in the Methods section into six sub-items recommending authors describe: the processes used to decide which studies were eligible for each synthesis; any methods required to prepare the data for synthesis; any methods used to tabulate or visually display results of individual studies and syntheses; any methods used to synthesise results; any methods used to explore possible causes of heterogeneity among study results (such as subgroup analysis, meta-regression); and any sensitivity analyses used to assess robustness of the synthesised results (see item #13a-13f).

Addition of a sub-item to the ‘Study selection’ item in the Results section recommending authors cite studies that might appear to meet the inclusion criteria, but which were excluded, and explain why they were excluded (see item #16b).

Splitting of the ‘Synthesis of results’ item in the Results section into four sub-items recommending authors: briefly summarise the characteristics and risk of bias among studies contributing to the synthesis; present results of all statistical syntheses conducted; present results of any investigations of possible causes of heterogeneity among study results; and present results of any sensitivity analyses (see item #20a-20d).

Addition of new items recommending authors report methods for and results of an assessment of certainty (or confidence) in the body of evidence for an outcome (see items #15 and #22).

Addition of a new item recommending authors declare any competing interests (see item #26).

Addition of a new item recommending authors indicate whether data, analytic code and other materials used in the review are publicly available and if so, where they can be found (see item #27).

PRISMA 2020 item checklist

PRISMA 2020 for Abstracts checklist*

Fig 1

PRISMA 2020 flow diagram template for systematic reviews. The new design is adapted from flow diagrams proposed by Boers, 55 Mayo-Wilson et al. 56 and Stovold et al. 57 The boxes in grey should only be completed if applicable; otherwise they should be removed from the flow diagram. Note that a “report” could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report or any other document providing relevant information.
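The boxes in the flow diagram are linked by simple arithmetic: each stage should account for every record that entered it. A minimal consistency check, using invented counts and illustrative field names of our own (not PRISMA's), might look like this:

```python
# Hypothetical consistency check for PRISMA 2020 flow-diagram counts.
# All numbers are invented; the dictionary keys are illustrative labels,
# not official PRISMA terminology.

counts = {
    "records_identified": 500,   # across all databases and registers
    "duplicates_removed": 120,
    "records_screened": 380,
    "records_excluded": 300,     # excluded at title/abstract screening
    "reports_sought": 80,
    "reports_not_retrieved": 5,
    "reports_assessed": 75,      # assessed for eligibility in full text
    "reports_excluded": 60,
    "studies_included": 15,
}

def check(counts):
    """Verify that each stage accounts for every record entering it."""
    assert counts["records_screened"] == (
        counts["records_identified"] - counts["duplicates_removed"])
    assert counts["reports_sought"] == (
        counts["records_screened"] - counts["records_excluded"])
    assert counts["reports_assessed"] == (
        counts["reports_sought"] - counts["reports_not_retrieved"])
    assert counts["studies_included"] == (
        counts["reports_assessed"] - counts["reports_excluded"])
    return True

print(check(counts))  # True when the totals are internally consistent
```

A check like this is easy to rerun whenever a search is updated, and catches the common slip of adjusting one box without propagating the change downstream.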


We recommend authors refer to PRISMA 2020 early in the writing process, because prospective consideration of the items may help to ensure that all the items are addressed. To help keep track of which items have been reported, the PRISMA statement website ( http://www.prisma-statement.org/ ) includes fillable templates of the checklists to download and complete (also available in the data supplement on bmj.com). We have also created a web application that allows users to complete the checklist via a user-friendly interface 58 (available at https://prisma.shinyapps.io/checklist/ and adapted from the Transparency Checklist app 59 ). The completed checklist can be exported to Word or PDF. Editable templates of the flow diagram can also be downloaded from the PRISMA statement website.

We have prepared an updated explanation and elaboration paper, in which we explain why reporting of each item is recommended and present bullet points that detail the reporting recommendations (which we refer to as elements). 41 The bullet-point structure is new to PRISMA 2020 and has been adopted to facilitate implementation of the guidance. 60 61 An expanded checklist, which comprises an abridged version of the elements presented in the explanation and elaboration paper, with references and some examples removed, is available in the data supplement on bmj.com. Consulting the explanation and elaboration paper is recommended if further clarity or information is required.

Journals and publishers might impose word and section limits, and limits on the number of tables and figures allowed in the main report. In such cases, if the relevant information for some items already appears in a publicly accessible review protocol, referring to the protocol may suffice. Alternatively, placing detailed descriptions of the methods used or additional results (such as for less critical outcomes) in supplementary files is recommended. Ideally, supplementary files should be deposited to a general-purpose or institutional open-access repository that provides free and permanent access to the material (such as Open Science Framework, Dryad, figshare). A reference or link to the additional information should be included in the main report. Finally, although PRISMA 2020 provides a template for where information might be located, the suggested location should not be seen as prescriptive; the guiding principle is to ensure the information is reported.

Use of PRISMA 2020 has the potential to benefit many stakeholders. Complete reporting allows readers to assess the appropriateness of the methods, and therefore the trustworthiness of the findings. Presenting and summarising characteristics of studies contributing to a synthesis allows healthcare providers and policy makers to evaluate the applicability of the findings to their setting. Describing the certainty in the body of evidence for an outcome and the implications of findings should help policy makers, managers, and other decision makers formulate appropriate recommendations for practice or policy. Complete reporting of all PRISMA 2020 items also facilitates replication and review updates, as well as inclusion of systematic reviews in overviews (of systematic reviews) and guidelines, so teams can leverage work that is already done and decrease research waste. 36 62 63

We updated the PRISMA 2009 statement by adapting the EQUATOR Network’s guidance for developing health research reporting guidelines. 64 We evaluated the reporting completeness of published systematic reviews, 17 21 36 37 reviewed the items included in other documents providing guidance for systematic reviews, 38 surveyed systematic review methodologists and journal editors for their views on how to revise the original PRISMA statement, 35 discussed the findings at an in-person meeting, and prepared this document through an iterative process. Our recommendations are informed by the reviews and survey conducted before the in-person meeting, theoretical considerations about which items facilitate replication and help users assess the risk of bias and applicability of systematic reviews, and co-authors’ experience with authoring and using systematic reviews.

Various strategies to increase the use of reporting guidelines and improve reporting have been proposed. They include educators introducing reporting guidelines into graduate curricula to promote good reporting habits of early career scientists 65 ; journal editors and regulators endorsing use of reporting guidelines 18 ; peer reviewers evaluating adherence to reporting guidelines 61 66 ; journals requiring authors to indicate where in their manuscript they have adhered to each reporting item 67 ; and authors using online writing tools that prompt complete reporting at the writing stage. 60 Multi-pronged interventions, where more than one of these strategies are combined, may be more effective (such as completion of checklists coupled with editorial checks). 68 However, of 31 interventions proposed to increase adherence to reporting guidelines, the effects of only 11 have been evaluated, mostly in observational studies at high risk of bias due to confounding. 69 It is therefore unclear which strategies should be used. Future research might explore barriers and facilitators to the use of PRISMA 2020 by authors, editors, and peer reviewers, designing interventions that address the identified barriers, and evaluating those interventions using randomised trials. To inform possible revisions to the guideline, it would also be valuable to conduct think-aloud studies 70 to understand how systematic reviewers interpret the items, and reliability studies to identify items where there is varied interpretation of the items.

We encourage readers to submit evidence that informs any of the recommendations in PRISMA 2020 (via the PRISMA statement website: http://www.prisma-statement.org/ ). To enhance accessibility of PRISMA 2020, several translations of the guideline are under way (see available translations at the PRISMA statement website). We encourage journal editors and publishers to raise awareness of PRISMA 2020 (for example, by referring to it in journal “Instructions to authors”), endorsing its use, advising editors and peer reviewers to evaluate submitted systematic reviews against the PRISMA 2020 checklists, and making changes to journal policies to accommodate the new reporting recommendations. We recommend existing PRISMA extensions 47 49 50 51 52 53 71 72 be updated to reflect PRISMA 2020 and advise developers of new PRISMA extensions to use PRISMA 2020 as the foundation document.

We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders. Ultimately, we hope that uptake of the guideline will lead to more transparent, complete, and accurate reporting of systematic reviews, thus facilitating evidence based decision making.

Acknowledgments

We dedicate this paper to the late Douglas G Altman and Alessandro Liberati, whose contributions were fundamental to the development and implementation of the original PRISMA statement.

We thank the following contributors who completed the survey to inform discussions at the development meeting: Xavier Armoiry, Edoardo Aromataris, Ana Patricia Ayala, Ethan M Balk, Virginia Barbour, Elaine Beller, Jesse A Berlin, Lisa Bero, Zhao-Xiang Bian, Jean Joel Bigna, Ferrán Catalá-López, Anna Chaimani, Mike Clarke, Tammy Clifford, Ioana A Cristea, Miranda Cumpston, Sofia Dias, Corinna Dressler, Ivan D Florez, Joel J Gagnier, Chantelle Garritty, Long Ge, Davina Ghersi, Sean Grant, Gordon Guyatt, Neal R Haddaway, Julian PT Higgins, Sally Hopewell, Brian Hutton, Jamie J Kirkham, Jos Kleijnen, Julia Koricheva, Joey SW Kwong, Toby J Lasserson, Julia H Littell, Yoon K Loke, Malcolm R Macleod, Chris G Maher, Ana Marušic, Dimitris Mavridis, Jessie McGowan, Matthew DF McInnes, Philippa Middleton, Karel G Moons, Zachary Munn, Jane Noyes, Barbara Nußbaumer-Streit, Donald L Patrick, Tatiana Pereira-Cenci, Ba’ Pham, Bob Phillips, Dawid Pieper, Michelle Pollock, Daniel S Quintana, Drummond Rennie, Melissa L Rethlefsen, Hannah R Rothstein, Maroeska M Rovers, Rebecca Ryan, Georgia Salanti, Ian J Saldanha, Margaret Sampson, Nancy Santesso, Rafael Sarkis-Onofre, Jelena Savović, Christopher H Schmid, Kenneth F Schulz, Guido Schwarzer, Beverley J Shea, Paul G Shekelle, Farhad Shokraneh, Mark Simmonds, Nicole Skoetz, Sharon E Straus, Anneliese Synnot, Emily E Tanner-Smith, Brett D Thombs, Hilary Thomson, Alexander Tsertsvadze, Peter Tugwell, Tari Turner, Lesley Uttley, Jeffrey C Valentine, Matt Vassar, Areti Angeliki Veroniki, Meera Viswanathan, Cole Wayant, Paul Whaley, and Kehu Yang. We thank the following contributors who provided feedback on a preliminary version of the PRISMA 2020 checklist: Jo Abbott, Fionn Büttner, Patricia Correia-Santos, Victoria Freeman, Emily A Hennessy, Rakibul Islam, Amalia (Emily) Karahalios, Kasper Krommes, Andreas Lundh, Dafne Port Nascimento, Davina Robson, Catherine Schenck-Yglesias, Mary M Scott, Sarah Tanveer and Pavel Zhelnov. 
We thank Abigail H Goben, Melissa L Rethlefsen, Tanja Rombey, Anna Scott, and Farhad Shokraneh for their helpful comments on the preprints of the PRISMA 2020 papers. We thank Edoardo Aromataris, Stephanie Chang, Toby Lasserson and David Schriger for their helpful peer review comments on the PRISMA 2020 papers.

Contributors: JEM and DM are joint senior authors. MJP, JEM, PMB, IB, TCH, CDM, LS, and DM conceived this paper and designed the literature review and survey conducted to inform the guideline content. MJP conducted the literature review, administered the survey and analysed the data for both. MJP prepared all materials for the development meeting. MJP and JEM presented proposals at the development meeting. All authors except for TCH, JMT, EAA, SEB, and LAM attended the development meeting. MJP and JEM took and consolidated notes from the development meeting. MJP and JEM led the drafting and editing of the article. JEM, PMB, IB, TCH, LS, JMT, EAA, SEB, RC, JG, AH, TL, EMW, SM, LAM, LAS, JT, ACT, PW, and DM drafted particular sections of the article. All authors were involved in revising the article critically for important intellectual content. All authors approved the final version of the article. MJP is the guarantor of this work. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.

Funding: There was no direct funding for this research. MJP is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618) and was previously supported by an Australian National Health and Medical Research Council (NHMRC) Early Career Fellowship (1088535) during the conduct of this research. JEM is supported by an Australian NHMRC Career Development Fellowship (1143429). TCH is supported by an Australian NHMRC Senior Research Fellowship (1154607). JMT is supported by Evidence Partners Inc. JMG is supported by a Tier 1 Canada Research Chair in Health Knowledge Transfer and Uptake. MML is supported by The Ottawa Hospital Anaesthesia Alternate Funds Association and a Faculty of Medicine Junior Research Chair. TL is supported by funding from the National Eye Institute (UG1EY020522), National Institutes of Health, United States. LAM is supported by a National Institute for Health Research Doctoral Research Fellowship (DRF-2018-11-ST2-048). ACT is supported by a Tier 2 Canada Research Chair in Knowledge Synthesis. DM is supported in part by a University Research Chair, University of Ottawa. The funders had no role in considering the study design or in the collection, analysis, interpretation of data, writing of the report, or decision to submit the article for publication.

Competing interests: All authors have completed the ICMJE uniform disclosure form at http://www.icmje.org/conflicts-of-interest/ and declare: EL is head of research for the BMJ ; MJP is an editorial board member for PLOS Medicine ; ACT is an associate editor and MJP, TL, EMW, and DM are editorial board members for the Journal of Clinical Epidemiology ; DM and LAS were editors in chief, LS, JMT, and ACT are associate editors, and JG is an editorial board member for Systematic Reviews . None of these authors were involved in the peer review process or decision to publish. TCH has received personal fees from Elsevier outside the submitted work. EMW has received personal fees from the American Journal for Public Health , for which he is the editor for systematic reviews. VW is editor in chief of the Campbell Collaboration, which produces systematic reviews, and co-convenor of the Campbell and Cochrane equity methods group. DM is chair of the EQUATOR Network, IB is adjunct director of the French EQUATOR Centre and TCH is co-director of the Australasian EQUATOR Centre, which advocates for the use of reporting guidelines to improve the quality of reporting in research articles. JMT received salary from Evidence Partners, creator of DistillerSR software for systematic reviews; Evidence Partners was not involved in the design or outcomes of the statement, and the views expressed solely represent those of the author.

Provenance and peer review: Not commissioned; externally peer reviewed.

Patient and public involvement: Patients and the public were not involved in this methodological research. We plan to disseminate the research widely, including to community participants in evidence synthesis organisations.

This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) license, which permits others to distribute, remix, adapt and build upon this work, for commercial use, provided the original work is properly cited. See: http://creativecommons.org/licenses/by/4.0/ .



PRISMA for Review of Management Literature – Method, Merits, and Limitations – An Academic Review

Advancing Methodologies of Conducting Literature Review in Management Domain

ISBN : 978-1-80262-372-7 , eISBN : 978-1-80262-371-0

Publication date: 24 November 2023

Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) is a widely accepted guideline for performing a systematic review (SR) in clinical journals. It not only helps an author to improve the reporting but also assists reviewers and editors in the critical appraisal of available SR. These tools help in achieving reproducibility in research, a major concern in contemporary academic research. But there is a lack of awareness about the approach among management researchers. This chapter attempts to fill this gap using a narrative review of reliable online resources and peer-reviewed articles to discuss the PRISMA guidelines and recent amendments. The chapter further points out the limitations of PRISMA in the review of management literature and suggests measures to overcome that. This piece of literature introduces a reader to the basics of a systematic review using PRISMA as an instrument. One of the significant contributions is to delineate a seven-step strategy to attain reproducibility in the systematic review. The chapter is useful for researchers and academicians in the field of social science and management.

  • Systematic review
  • Review methods
  • PRISMA extensions
  • Reproducibility
  • Literature review

Mishra, V. and Mishra, M.P. (2023), "PRISMA for Review of Management Literature – Method, Merits, and Limitations – An Academic Review", Rana, S. , Singh, J. and Kathuria, S. (Ed.) Advancing Methodologies of Conducting Literature Review in Management Domain ( Review of Management Literature, Vol. 2 ), Emerald Publishing Limited, Leeds, pp. 125-136. https://doi.org/10.1108/S2754-586520230000002007

Emerald Publishing Limited

Copyright © 2024 Vinaytosh Mishra and Monu Pandey Mishra. Published under exclusive licence by Emerald Publishing Limited



Physical Therapy 710

Literature Review Using PRISMA


PRISMA step-by-step

Need help with PRISMA? This video seminar might help: Systematic Literature Review using PRISMA: A Step-by-Step Guide, Dr Saeed Pahlevansharif, Faculty of Business and Law, Taylor's University

PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews


PRISMA 2009 Checklist: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) is a 27-item checklist used to improve transparency in systematic reviews. These items cover all aspects of the manuscript, including title, abstract, introduction, methods, results, discussion, and funding.


The PRISMA Flow Diagram visually depicts the flow of studies through each phase of the review process.



  • URL: https://library.csi.cuny.edu/pht710
  • Last Updated: Apr 11, 2024 12:53 PM


PRISMA-S: an extension to the PRISMA Statement for Reporting Literature Searches in Systematic Reviews

Melissa L. Rethlefsen

1 Health Science Center Libraries, George A. Smathers Libraries, University of Florida, Gainesville, USA

Shona Kirtley

2 UK EQUATOR Centre, Centre for Statistics in Medicine (CSM), Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), Botnar Research Centre, University of Oxford, Windmill Road, Oxford, OX3 7LD UK

Siw Waffenschmidt

3 Institute for Quality and Efficiency in Health Care, Cologne, Germany

Ana Patricia Ayala

4 Gerstein Science Information Centre, University of Toronto, Toronto, Canada

David Moher

5 Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, The Ottawa Hospital, General Campus, Centre for Practice Changing Research Building, 501 Smyth Road, PO BOX 201B, Ottawa, Ontario K1H 8L6 Canada

Matthew J. Page

6 School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia

Jonathan B. Koffel

7 University of Minnesota, Minneapolis, USA

Associated Data

All data is available via the PRISMA-S PRISMA Search Reporting Extension OSF site (10.17605/OSF.IO/YGN9W) [ 32 ]. This includes all data relating to item development, survey instruments, data from the Delphi surveys, and consent documents.

Background

Literature searches underlie the foundations of systematic reviews and related review types. Yet, the literature searching component of systematic reviews and related review types is often poorly reported. Guidance for literature search reporting has been diverse and, in many cases, does not offer enough detail to authors who need more specific information about reporting search methods and information sources in a clear, reproducible way. This document presents the PRISMA-S (Preferred Reporting Items for Systematic reviews and Meta-Analyses literature search extension) checklist, and explanation and elaboration.

Methods

The checklist was developed using a 3-stage Delphi survey process, followed by a consensus conference and public review process.

Results

The final checklist includes 16 reporting items, each of which is detailed with exemplar reporting and rationale.

Conclusions

The intent of PRISMA-S is to complement the PRISMA Statement and its extensions by providing a checklist that could be used by interdisciplinary authors, editors, and peer reviewers to verify that each component of a search is completely reported and therefore reproducible.

Supplementary Information

The online version contains supplementary material available at 10.1186/s13643-020-01542-z.

Introduction

One crucial component of a systematic review is the literature search. The literature search, or information retrieval process, not only informs the results of a systematic review; it is the underlying process that establishes the data available for analysis. Additional components of the systematic review process such as screening, data extraction, and qualitative or quantitative synthesis procedures are dependent on the identification of eligible studies. As such, the literature search must be designed to be both robust and reproducible to ensure the minimization of bias.

Guidelines exist for both the conduct of literature searches (Table 2) for systematic reviews and their reporting [ 2 – 7 ]. Problematically, however, the many guidelines for reporting systematic review searches share few common reporting elements. In fact, Sampson et al. discovered that of the eleven instruments designed to help authors report literature searches well, only one item appeared in all eleven instruments [ 8 ]. Though Sampson et al.’s study was conducted in 2007, the problem has only been compounded as new checklists and tools have continued to be developed. The most commonly used reporting guidance for systematic reviews, which covers the literature search component, is the Preferred Reporting Items for Systematic reviews and Meta-Analyses Statement, or PRISMA Statement [ 9 ]. The 2009 PRISMA Statement checklist included three items related to literature search reporting, items 7, 8, and 17:

Item 7: Describe all information sources (e.g., databases with dates of coverage, contact with study authors to identify additional studies) in the search and date last searched.
Item 8: Present full electronic search strategy for at least one database, including any limits used, such that it could be repeated.
Item 17: Give numbers of studies screened, assessed for eligibility, and included in the review, with reasons for exclusions at each stage, ideally with a flow diagram.
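Item 17 is essentially a bookkeeping requirement: the counts reported at each stage of the flow diagram must be derivable from the stage before. A minimal sketch of that arithmetic, with all numbers invented for illustration:

```python
# Illustrative flow-diagram bookkeeping for PRISMA item 17.
# Every figure below is invented; only the consistency rules matter.
identified = 1245      # records retrieved across all database searches
duplicates = 312       # removed before screening
screened = identified - duplicates

excluded_title_abstract = 801
assessed_full_text = screened - excluded_title_abstract

# Item 17 asks for reasons for exclusion at the full-text stage
excluded_full_text = {"wrong population": 58, "no comparator": 41, "not retrievable": 13}
included = assessed_full_text - sum(excluded_full_text.values())

print(screened, assessed_full_text, included)  # 933 132 20
```

A reader (or peer reviewer) can apply exactly these subtractions to a published flow diagram to check that no records are unaccounted for.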

Despite wide usage of the PRISMA Statement [ 10 ], compliance with its items regarding literature search reporting is low [ 11 – 14 ]. Even for those studies which explicitly reference PRISMA, there is only slight, statistically non-significant evidence of improved reporting, as found by Page et al. [ 15 ]. Part of the challenge may be the multifactorial nature of each of the PRISMA items relating to searches; authors may feel that if they have completed one component of an item, they can check off the item altogether. Another part of the challenge may be that many systematic reviews do not include librarians or information specialists as members of the systematic review team or as authors on the final manuscript [ 11 , 16 – 18 ]. Preliminary research suggests that librarian or information specialist involvement is correlated with reproducibility of searches [ 16 – 18 ], likely due to their expertise surrounding search development and documentation. However, reviews where librarians are authors still include reproducible searches only 64% of the time [ 17 ].

A larger issue may be that, even amongst librarians and information specialists, debate exists as to what constitutes a reproducible search and how best to report the details of the search. Researchers assessing the reproducibility of the search have used varying methods to determine what constitutes a reproducible search [ 11 , 17 , 19 , 20 ]. Post-publication peer review of search methods, even amongst Cochrane reviews, which generally have superior reporting compared to non-Cochrane reviews [ 15 ], has shown that reporting that appears complete may still pose challenges for those wishing to reproduce searches [ 20 – 24 ]. Furthermore, little guidance on how to report searches using information sources or methods other than literature databases, such as searching web sites or study registries, exists [ 25 , 26 ].

Incomplete reporting of the literature search methods can introduce doubt and diminish trust in the final systematic review conclusions. If researchers are unable to understand or reproduce how information was gathered for a systematic review, they may suspect the authors of having introduced bias into their review by not conducting a thorough or pre-specified literature search. After observing the high number of systematic reviews with poorly reported literature searches, we sought to create an extension to the PRISMA statement. Our aims were four-fold:

  • To provide extensive guidance on reporting the literature search components of a systematic review.
  • To create a checklist that could be used by authors, editors, and peer reviewers to verify that each component of a search was completely reported and therefore reproducible.
  • To develop an interdisciplinary checklist applicable to all method-driven literature searches for evidence synthesis.
  • To complement the PRISMA Statement and its extensions.

Because we intend the checklist to be used in all fields and disciplines, we use “systematic reviews” throughout this document as a representative name for the entire family of evidence syntheses [ 27 ]. This includes, but is not limited to, scoping reviews, rapid reviews, realist reviews, metanarrative reviews, mixed methods reviews, umbrella reviews, and evidence maps [ 28 ]. We use the term “literature search” or “search” throughout to encompass the full range of possible search methods and information sources.

Part 1: Developing the Checklist

After consultation with members of the PRISMA Statement steering group (D.M. and D.G.A.), we formed an executive committee (M.L.R, J.K., S.K.) and developed a protocol [ 29 ] according to the steps outlined in the “Guidance for Developers of Health Research Reporting Guidelines [ 30 ].” The protocol was registered on the EQUATOR Network [ 29 ]. We identified 405 potential items relevant to reporting searches in systematic reviews from 61 sources (see Additional file 1 ) located through a search of MEDLINE via Ovid, Embase via Embase.com , and LISTA via EBSCOhost, in addition to reviewing all of the sources identified by the EQUATOR Network relating to systematic reviews. We also searched our personal files and examined references of included documents for additional sources. Details of the search are available in Additional file 1 . Sources included both explicit reporting guidelines and studies assessing reproducibility of search strategies. The 405 items were reviewed for overlap and consolidated into 123 remaining items for potential inclusion in a checklist.

To narrow the list into a usable checklist, we then used a three-step Delphi survey process [ 31 ]. The first survey included the initially identified 123 items and asked respondents to rate each item on a 4-point Likert-type scale. Items that 70% of experts rated as 3 or 4 (4 being “essential” and 1 “not important”) and that received a mean score of at least 3.25 were retained for rating in the second round of the Delphi process. Respondents to the first survey were invited to participate in the second and third rounds. The second round asked respondents to pick the 25 most essential items out of the remaining 53 potential items; the third round was identical, except respondents also selected the most appropriate location for reporting their selected items (e.g., in the main text, or a supplementary file). The items were ranked and categorized by general theme for discussion at an in-person consensus conference.
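The round-one retention rule described above can be expressed as a simple filter. This is a sketch only; the ratings below are invented, and the function name is ours, not part of the study materials:

```python
# Round-one Delphi retention rule: keep an item only if at least 70% of
# respondents rated it 3 or 4 AND its mean rating is at least 3.25.
def retained(ratings, pct_threshold=0.70, mean_threshold=3.25):
    high = sum(1 for r in ratings if r >= 3) / len(ratings)   # share rating 3 or 4
    mean = sum(ratings) / len(ratings)
    return high >= pct_threshold and mean >= mean_threshold

item_a = [4, 4, 3, 4, 3, 4, 3, 4]   # broad, strong support: passes both rules
item_b = [4, 2, 3, 2, 4, 2, 3, 2]   # mixed support: fails the 70% rule

print(retained(item_a), retained(item_b))  # True False
```

Note that both conditions must hold: an item rated 3 by everyone would pass the 70% rule but fail the mean threshold.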

We created a list of one hundred and sixty-three international experts, including librarian and information specialists with expertise in systematic reviews, researchers who had written about systematic review reporting, journal editors, and systematic review methodologists, to whom we sent our initial Delphi survey. The list of experts was created using a combination of publications, mailing lists, conference proceedings, and knowledge of the authors to represent research groups and experts in 23 countries. We received 52 responses (32% response rate) to the first survey, and of these, 35 (67% response rate) completed both surveys two and three. This study was declared exempt by the University of Utah Institutional Review Board (IRB_00088425).

The results of the Delphi process were reported at a consensus conference meeting that took place in May 2016 concurrently with Mosaic ‘16, the joint meeting of the Medical Library Association, Canadian Health Libraries Association/Association des bibliothèques de la santé du Canada, and the International Clinical Librarian Conference (ICLC). 38 individuals attended the consensus conference, 14 (37%) of whom had participated in the Delphi surveys. At the consensus conference, the grouped and ranked remaining items were distributed to small groups who were asked to discuss, consolidate, remove, or add missing critical items under the guidance of a group leader. After two rounds of discussion, the group leaders presented the discussion and proposed list items from their small groups for consideration by the whole group of experts.

Upon completion of the consensus conference, 30 items remained from those identified during the Delphi process, with an additional three items that had been excluded during the Delphi process added back to the draft checklist because meeting attendees considered them critical to the guideline. The list was then consolidated and reviewed by executive committee members, including two new information specialist members (S.W. and A.P.A). The draft checklist and explanation and elaboration document was released to the public on March 20, 2019, along with all data and study materials [ 32 ]. All participants in the Delphi process and/or consensus conference were contacted via email with instructions on how to provide feedback on the draft checklist items and/or elaboration and explanation document by commenting directly on the explanation and elaboration draft using a private commenting system, Hypothesis [ 33 ], or if preferred, via email. Comments from other interested individuals were solicited via Twitter, conference presentations, and personal contacts. Comments were collected from the private Hypothesis group, the public Hypothesis comments, and via email. All comments were combined into a single document. Executive committee members reviewed each comment in duplicate to indicate what type of feedback was received (i.e., linguistic, major substantive, minor substantive, or unclear) and, for substantive comments, whether change was recommended or required further discussion.

During the draft and revision process (March 20–June 15, 2019), 358 separate comments were received from 22 individuals and organizations. Based upon the extensive feedback received, the executive team revised the checklist and developed the next iteration, which was released on December 6, 2019, to coincide with the 2019 Virtual Cochrane Colloquium Santiago. Additional feedback from this release was incorporated into the final checklist. Throughout the draft and revision process, several teleconferences were held with the lead of the PRISMA 2020 statement (M.J.P), as an update of the 2009 PRISMA statement was in development, to ensure that the content on search methods was consistent between the PRISMA 2020 and PRISMA-S guidelines [ 34 , 35 ].

Part 2: Checklist

PRISMA-S is a 16-item checklist that covers multiple aspects of the search process for systematic reviews. It is intended to guide reporting, not conduct, of the search. The checklist should be read in conjunction with the Explanation and Elaboration (Part 3), which provides more detail about each item. We also include two boxes: one a glossary of terms (see Table 2) and the other guidance on depositing search data and method descriptions in online repositories (see Table 3).

Supplementary Materials

The Explanation and Elaboration also includes examples of good reporting for each item. Each exemplar is drawn from published systematic reviews. For clarity, some exemplars are edited to match the style of this document, including any original citations, and abbreviations are spelled out to aid comprehension. Any other edits to the text are noted with square brackets. A description of the rationale behind the item is explained, followed by additional suggestions for clear reporting and a suggested location(s) for reporting the item.

Not every systematic review will make use of all of the items in the Information Sources and Methods section of the checklist, depending on the research question and the methods chosen by the authors. The checklist provides a framework for the current most common and recommended types of information sources and methods for systematic reviews, but authors should use and report those items relevant and appropriate to their review. The checklist may also be used for systematic review protocols to fully document the planned search, in conjunction with the PRISMA-P reporting guideline [ 36 ] (Table 1).

PRISMA-S checklist. A downloadable version of the checklist is available on the PRISMA website [ 37 ]

Part 3: Explanation and Elaboration

Item 1. Database name

Name each individual database searched, stating the platform for each.

“The following electronic databases were searched: MEDLINE (Ovid), CINAHL (EBSCOhost), PsycINFO (Ovid), Cochrane Central Register of Controlled Trials (Ovid), SPORTDiscus (EBSCOhost), EMBASE (Ovid) and ProQuest Dissertations and Theses Global (ProQuest).” [ 38 ]

Explanation

Databases are the most commonly used tool to locate studies to include in systematic reviews and meta-analyses [ 6 , 39 ]. There is no single database that is able to provide a complete and accurate list of all studies that meet systematic review criteria, due to the differences in the articles included and the indexing methods used between databases (Table 2). These differences have led to recommendations that systematic review teams search multiple databases to maximize the likelihood of finding relevant studies [ 6 , 39 , 40 ]. This may include using broad disciplinary databases (e.g., MEDLINE [ 41 ], Embase [ 42 ], Scopus [ 43 ]), specialized databases (e.g., PsycINFO [ 44 ] or EconLit [ 45 ]), or regional databases (e.g., LILACS [ 46 ] or African Index Medicus [ 47 ]).

Many of these literature databases are available through multiple different search platforms (Table 2). For example, the MEDLINE database is available through at least 10 different platforms, including Ovid, EBSCOhost, Web of Science, and PubMed. Each platform offers different ways of searching the databases, such as platform-specific field codes (Table 2), phrase searching, truncation, or searching full text versus abstract and keyword only [ 48 ]. Different platforms may contain additional data that are not available in the original database, such as times cited, social media impact, or additional keywords. These differences between the platforms can have a meaningful impact on the results provided [ 48 – 50 ].

Authors should identify which specific literature databases were searched to locate studies included in the systematic review. It is important that authors indicate not only the database, but the platform through which the database was searched. This helps readers to evaluate the quality and comprehensiveness of the search and supports reproducibility and updating (Table 2) in the future by allowing the strategy to be copied and pasted as recommended in Item 8, below.

The distinctions between database and platform may not always be clear to authors, especially when the database is the only one available through a platform (e.g., Scopus [ 43 ]). In these cases, authors may choose to include the web address of the database in the text or the bibliography to provide clarity for their readers.

Suggested location for reporting

Report each database name and platform in the methods section and any supplementary materials (Table 2). If space permits, report key database names in the abstract.

Item 2. Multi-database searching

If databases were searched simultaneously on a single platform, state the name of the platform, listing all of the databases searched.

“The MEDLINE and Embase strategies were run simultaneously as a multi-file search in Ovid and the results de-duplicated using the Ovid de-duplication tool.” [ 51 ]
“A systematic literature search was performed in Web of Knowledge™ (including KCI Korean Journal Database, MEDLINE, Russian Science Citation Index, and SciELO Citation Index)….” [ 52 ]

Authors may choose to search multiple databases at once through a single search platform to increase efficiency. Along with the name of the platform, it is necessary to list the names of each of the individual databases included as part of the search. Including information about using this approach in the text of the manuscript helps readers immediately understand how the search was constructed and executed. This helps readers determine how effective the search strategy (Table 2) will be for each database [ 1 ].

Report any multi-database search (Table 2) in the methods section and any supplementary materials. If space permits, report key individual database names in the abstract, even if run through a multi-database search.

Item 3. Study registries

List any study registries searched.

“[We] searched several clinical trial registries (ClinicalTrials.gov, Current Controlled Trials (www.controlled-trials.com), Australian New Zealand Clinical Trials Registry (www.actr.org.au), and University Hospital Medical Information Network Clinical Trials Registry (www.umin.ac.jp/ctr)) to identify ongoing trials.” [ 53 ]

Study registries are a key source of information for systematic reviews and meta-analyses in the health sciences and increasingly in other disciplines. In the health sciences, study registries (Table 2) allow researchers to locate ongoing clinical trials and studies that may have gone unpublished [ 54 – 56 ]. Some funders, including the National Institutes of Health, require principal investigators to share their data on study registries within a certain time frame after grant completion [ 57 ]. This data may not have been published in any other location, making study registries a critical component of an information strategy, though timely reporting remains a challenge [ 58 , 59 ]. Different countries have their own study registries, as do many pharmaceutical companies.

Outside the health sciences, study registries are becoming increasingly important as many disciplines adopt study pre-registration as a tactic for improving the rigor of research. Though not yet as established as in the health sciences, these study registries are continually expanding and will serve as key sources for finding unpublished studies in fields in the social sciences and beyond.

To fully describe the study registries searched, list the name of each study registry searched, and include a citation or link to the study registry.

Report any study registries searched in the methods section and any supplementary materials.

Item 4. Online resources and browsing

Describe any online or print source purposefully searched or browsed (e.g., tables of contents, print conference proceedings, web sites), and how this was done.

“We also searched the grey literature using the search string: “public attitudes” AND “sharing” AND “health data” on Google (in June 2017). The first 20 results were selected and screened.” [ 60 ]
“The grey literature search was conducted in October 2015 and included targeted, iterative hand searching of 22 government and/or research organization websites that were suggested during the expert consultation and are listed in S1 Protocol. Twenty two additional citations were added to the review from the grey literature search.” [ 61 ]
“To locate unpublished studies, we searched Embase [via Embase.com] for conference proceedings since 2000 and hand-searched meeting abstracts of the Canadian Conference on Physician Health and the International Conference on Physician Health (2012 to 2016).” [ 62 ]

Systematic reviews were developed to remove as much bias as possible from the literature review process. One of the most important ways they achieve this reduction in bias is by searching beyond literature databases, which are skewed towards English-language publications with positive results [ 63 , 64 ]. To achieve a fuller picture of what the research on a specific topic looks like, systematic reviewers could seek out research that may be in progress and research that was never published [ 6 ]. Using other methods of finding research also helps identify research that may have been indexed in literature databases, but went undiscovered when searching those sources [ 40 ]. Seeking out this research often involves a complex strategy, drawing on a wealth of online and print resources as well as personal contacts.

Web search engines and specific web sites

Searching general internet search engines and searching the contents of specific websites is a key component of many systematic reviews [ 26 , 65 ]. Government, non-profit organization, and pharmaceutical company websites, for example, contain a wealth of information not published elsewhere [ 6 , 66 ]. Though searching a general search engine like Google or using a general search engine to search a specific website may introduce some bias into the search methodology through the personalization algorithms inherent in many of these tools [ 67 , 68 ], it is still important to fully document how web searches were conducted [ 65 ].

Authors should list all websites searched, along with their corresponding web address. Readers should be able to clearly understand if researchers used a website’s native search interface or advanced search techniques within a general search engine. If authors used a general search engine, authors should declare whether steps were taken to reduce personalization bias (e.g., using “incognito” mode in a browser). Authors may choose whether to detail the websites searched within the text (i.e., Google (http://www.google.com)), by citing the websites in the bibliography, or by listing the website with corresponding web address in supplementary material, as shown in the examples above.

Review teams may occasionally set an artificial limit to the number of items they will screen from a given search or source [ 65 ]. This is because searching web search engines and individual websites will often lead to an unmanageable number of results, the search engine itself may only display a restricted number of results (e.g., Google will only display 1000 results), or the team has a finite budget or timeline to complete the review. Thus, many systematic review teams utilizing web search engines will often pre-designate a limit to the number of results they review. If review teams choose to review a limited set of results, it should be noted in the text, along with the rationale.

Conference proceedings

Studies show that large percentages of research presented as papers and posters at conferences never make their way into the published literature, particularly if the study’s results were statistically negative [ 63 , 69 ]. Conference proceedings are often the only way to locate these studies. Including conference proceedings in a systematic review search helps minimize bias [ 70 ]. The introduction of online conference proceedings has been a boon to researchers and reduced the need to review printed abstract books. Additionally, some databases either include conference proceedings along with journal articles (i.e., Embase [ 42 ]) or contain only conference proceedings (i.e., ProceedingsFirst [ 71 ] or Directory of Published Papers [ 72 ]). Some conferences have made their abstracts available in a single database (i.e., International AIDS Society’s Abstract Archive [ 73 ]). When using these types of databases to search conference proceedings, authors can treat them as above in Item 1.

Individual conferences’ online proceedings may be password-protected for association members or conference attendees [ 74 ]. When reporting on conference proceedings searched or browsed (Table 2) via a conference or association’s online or print proceedings, authors must specify the conference names, the dates of conferences included, and the method used to search the proceedings (i.e., browsing print abstract books or using an online source). If the conference proceedings are searched online, authors should specify the web address(es) for the conference proceedings and the date(s) of the conferences. If the conference proceedings are published in a journal, the authors should cite the journal. If the proceedings are a standalone publication, authors may choose to cite them using the same methods used to cite a book or by providing the full information about the conference (name, location, dates, etc.) in a supplementary file.

General browsing

Authors also commonly browse print or online tables of contents, full contents of journals, or other sources that are the most likely to contain research on the topic sought. When purposefully browsing, describe any method used, the name of the journal or other source, and the time frame covered by the search, if applicable.

Report online information sources (Table 2) searched or browsed in the methods section and in any supplementary materials. Systematic reviews using several of these methods, or using multiple information sources for each method, may need to report their methods briefly in the methods section, but should fully report all necessary information to describe their approaches in supplementary materials.

Item 5. Citation searching

Indicate whether cited references or citing references were examined, and describe any methods used for locating cited/citing references (e.g., browsing reference lists, using a citation index, setting up email alerts for references citing included studies).

“Reference lists of included articles were manually screened to identify additional studies.” [ 75 ]
“[W]e used all shared decision making measurement instruments that were identified in Gärtner et al’s recent systematic review (Appendix A). We then performed a systematic citation search, collecting all articles that cited the original papers reporting on the development, validation, or translation of any the observational and/or self-reported shared decision making measurement instruments identified in that review. An experienced librarian (P.J.E.) searched Web of Science [Science Citation Index] and Scopus for articles published between January 2012 and February 2018.” [ 76 ]
“We [conducted] citation tracking of included studies in Web of Science Core Collection on an ongoing basis, using citation alerts in Web of Science Core Collection.” [ 77 ]

One of the most common search methods is reviewing the references or bibliographies of included studies [ 11 , 17 ]. This type of citation searching (looking for cited references) can be additive to other cited reference searching methods, such as examining bibliographies of relevant systematic reviews. In addition, researchers may choose to look for articles that cite specified studies [ 78 ]. This may include looking beyond one level forwards and backwards (e.g., examining the bibliographies of articles cited by specified articles) [ 78 ]. Looking at bibliographies of included articles or other specified articles is often conducted by examining full-text articles, but it can also be accomplished using online tools called citation indexes (Table 2).

The use of these methods can be complicated to describe, but the explanation should clearly state the database used, if applicable (i.e., Scopus, Google Scholar, Science Citation Index) and describe any other methods used. Authors also must cite the “base” article(s) that citation searching was performed upon, either for examining cited or citing articles (Table 2). If the same database is used for both a topical search as well as citation searching, describe each use separately. For manually checking the reference lists for included articles, a simple statement as in the first example is sufficient.

Report citation searching details in the methods section and in any supplementary materials.

Item 6. Contacts

Indicate whether additional studies or data were sought by contacting authors, experts, manufacturers, or others.

“We contacted representatives from the manufacturers of erythropoietin-receptor agonists (Amgen, Ortho-Biotech, Roche), corresponding or first authors of all included trials and subject-area experts for information about ongoing studies.” [ 79 ]
“We also sought data via expert requests. We requested data on the epidemiology of injecting drug use and blood-borne viruses in October, 2016, via an email distribution process and social media. This process consisted of initial emails sent to more than 2000 key experts and organisations, including contacts in the global, regional, and country offices of WHO, UNAIDS, Global Fund, and UNODC (appendix p 61). Staff in those agencies also forwarded the request to their colleagues and other relevant contacts. One member of the research team (SL) posted a request for data on Twitter, which was delivered to 5525 individual feeds (appendix p 62).” [ 80 ]

Contacting manufacturers (e.g., pharmaceutical companies), or reaching out to authors or experts directly or through organizations, is a key method to locate unpublished and ongoing studies [ 6 ]. Contacting authors or manufacturers may also be necessary when publications, conference proceedings, or clinical trials registry records do not provide the complete information needed [ 63 , 81 ]. Contacting manufacturers or regulating agencies might be required to acquire complete trial data from the clinical study reports [ 82 , 83 ]. Broader calls for evidence may also be issued when no specific groups or individuals are targeted.

Contact methods may vary widely, depending on the context, and may include personal contact, web forms, email mailing lists, mailed letters, social media contacts, or other methods. As these strategies are inherently difficult to reproduce, researchers should attempt to give as much detail as possible on what data or information was sought, who or what group(s) provided data or information, and how the individuals or groups were identified.

Report information about contacts to solicit additional information in the methods section and in any supplementary materials. Systematic reviews using elaborate calls for evidence or making extensive use of contacts as an information source may need to report their methods briefly in the methods section, but should fully report all necessary information to describe their approaches in supplementary materials.

Item 7. Other methods

Describe any additional information sources or search methods used.

“We also searched… our personal files.” [ 84 ]
“PubMed’s related articles search was performed on all included articles.” [ 85 ]

A thorough systematic review may utilize many additional methods of locating studies beyond database searching, many of which may not be reproducible methods. A key example is searching personal files. Another is using databases’ built-in tools, such as PubMed’s Related Articles feature [ 86 ] or Clarivate Analytics’ Web of Science’s Related Records feature [ 87 ], to locate relevant articles based on commonalities with a starting article. Because these tools are often proprietary and their algorithms opaque, researchers may not be able to replicate the exact results at a later date. To attempt to be as transparent as possible, researchers should both note the tool that was used and cite any articles these operations were run upon. For all “other” methods, it is still important to declare that the method was used, even if it may not be fully replicable.

Report information about any other additional information sources or search methods used in the methods section and in any supplementary materials.

Item 8. Full search strategies

Include the search strategies for each database and information source, copied and pasted exactly as run.

Database search. Methods section description. “The reproducible searches for all databases are available at DOI:10.7302/Z2VH5M1H.” [ 88 ]
Database search. One of the full search strategies from supplemental materials in online repository. “Embase.com (692 on Jan 19, 2017)
#1 'social media'/exp OR (social NEAR/2 (media* OR medium* OR network*)):ti OR twitter:ti OR youtube:ti OR facebook:ti OR linkedin:ti OR pinterest:ti OR microblog*:ti OR blog:ti OR blogging:ti OR tweeting:ti OR 'web 2.0':ti
#2 'professionalism'/exp OR 'ethics'/exp OR 'professional standard'/de OR 'professional misconduct'/de OR ethic*:ab,ti OR unprofessional*:ab,ti OR professionalism:ab,ti OR (professional* NEAR/3 (standard* OR misconduct)):ab,ti OR ((professional OR responsib*) NEAR/3 (behavi* OR act OR conduct*)):ab,ti
#3 #1 AND #2 AND [english]/lim NOT ('conference abstract':it OR 'conference paper':it)” [ 88 ]
Online resources and browsing. Methods section description. “The approach to study identification from this systematic review is transparently reported in the Electronic Supplementary Material Appendix S1.” [ 89 ]
Online resources and browsing. One of the full online resource search strategies reported in supplement. “Date: 12/01/16. Portal/URL: Google. https://www.google.co.uk/webhp?hl=en. Search terms: ((Physical training) and (man or men or male or males) and (female or females or women or woman) and (military)). Notes: First 5 pages screened on title (n=50 records).” [ 89 ]

Systematic reviews and related review types rely on thorough and complex search strategies to identify literature on a given topic. The search strategies used to conduct this data gathering are essential to the transparency and reproducibility of any systematic review. Without being able to assess the quality of the search strategies used, readers are unable to assess the quality of the systematic review [ 9 , 11 , 17 ].

When space was at a premium in publications, complete reporting of search strategies was a challenge. Because it was necessary to balance the need for transparency with publication restrictions, previous PRISMA guidelines recommended including the complete search strategy from a minimum of one database searched [ 9 ]. Many systematic reviews therefore reported only the minimum necessary. However, reporting only selected search strategies can contribute to the observed irreproducibility of many systematic reviews [ 11 , 17 ].

The prior versions of PRISMA did not elaborate on methods for reporting search strategies outside of literature databases. Subsequent to its publication, many groups have begun identifying the challenges of fully documenting other types of search methods [ 90 , 91 ]. Explicit documentation of the full details of all search strategies undertaken is now recommended [ 91 , 92 ]. These details should be reported to ensure transparency and maximum reproducibility, including searches and purposeful browsing activities undertaken in web search engines, websites, conference proceedings databases, electronic journals, and study registries.

Journal restrictions vary, but many journals now allow authors to publish supplementary materials with their manuscripts. At minimum, all search strategies, including search strategies for web search engines, websites, conference proceedings databases, electronic journals, and study registries, should be submitted as a supplement for publication. Though most supplements are appropriately accessible on journal publishers’ web sites as submitted, supplements may disappear [ 17 ]. In addition, many supplements are only available to journal subscribers [ 93 ]. Similarly, manuscripts available on public access systems like PubMed Central [ 94 ] may not have the corresponding supplemental materials properly linked. For optimal accessibility, authors should upload complete documentation to a data repository (Table 2), an institutional repository, or other secure and permanent online archive instead of relying on journal publication (see Table 3 for additional information).

It is important to document and report the search strategy exactly as run, typically by copying and pasting the search strategy directly as entered into the search platform. This is to ensure that information such as the fields searched, term truncation, and combinations of terms (i.e., Boolean logic or phrases) are accurately recorded. Many times, the copied and pasted version of a search strategy will also include key information such as limits (see Item 9; Table 2) used, databases searched within a multi-database search, and other database-specific detail that will enable more accurate reporting and greater reproducibility. This documentation must also repeat the database or resource name, database platform or web address, and other details necessary to clearly describe the resource.
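As a purely hypothetical sketch (the topic, terms, line numbers, and result counts below are invented for illustration and are not drawn from any review cited in this article), a copied-and-pasted strategy from a line-numbered platform such as Ovid might look like:

```
Database: MEDLINE (Ovid)
Date searched: 2024-03-15
1. exp Stroke/
2. (stroke or "cerebrovascular accident*").ti,ab,kf.
3. 1 or 2
4. exp Telemedicine/
5. (telehealth or telemedicine or "remote monitoring").ti,ab,kf.
6. 4 or 5
7. 3 and 6
8. limit 7 to yr="2010-Current"   <- limit visible in the pasted strategy (Item 9)
```

Pasting the numbered history verbatim, rather than re-typing it, preserves the field codes (.ti,ab,kf.), explosions (exp), truncation, and limits exactly as the platform interpreted them.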

Report the full search strategy in supplementary materials as described above. Describe and link to the location of the supplementary materials in the methods section.

Item 9. Limits and restrictions

Specify that no limits were used, or describe any limits or restrictions applied to a search (e.g., date or time period, language, study design) and provide justification for their use.

No limits . “We imposed no language or other restrictions on any of the searches.” [ 95 ]
Limits described without justification . “The search was limited to the English language and to human studies.” [ 96 ]
“The following search limits were then applied: randomized clinical trials (RCTs) of humans 18 years or older, systematic reviews, and meta-analyses.” [ 97 ]
Limits described with justification . “The search was limited to publications from 2000 to 2018 given that more contemporary studies included patient cohorts that are most reflective of current co-morbidities and patient characteristics as a result of the evolving obesity epidemic.” [ 98 ]
Limits described, one with justification . “Excluded publication types were comments, editorials, patient education handouts, newspaper articles, biographies, autobiographies, and case reports. All languages were included in the search result; non-English results were removed during the review process…. To improve specificity, the updated search was limited to human participants.” [ 99 ]

Many databases have features that allow searchers to quickly restrict a search using limits. The limits available are unique to both the database and the platform used to search it. Limits are dependent on the accuracy of the indexer, the timeliness of indexing, and the quality of any publisher-supplied data. For instance, using database limits to restrict searches to randomized controlled trials will only find records identified by the indexer as randomized controlled trials. Since indexing may take 6 months or more to complete for any given article, searchers risk missing new articles when using database limits.

Using database-provided limit features should not be confused with using filters (see Item 10; Table 2) or inclusion criteria for the systematic review. For example, systematic review teams may choose to only include English-language randomized controlled trials. This can be done using limits, a combination of a filter (see Item 10) and screening, or screening alone. It should be clear to the reader which approach is used. For instance, in the “Limits described, one with justification” example above, the authors used database limits to restrict their search by publication type, but they did not use database limits to restrict by language, even though that was a component of their eligibility criteria. They also used database limits to restrict to human participants in their search update.

It is important for transparency and reproducibility that any database limits applied when running the search are reported accurately, as their use has high potential for introducing bias into a search [ 1 , 64 , 100 , 101 ]. Database limits are not recommended for use in systematic reviews, due to their fallibility [ 39 , 100 ]. If used, review teams should include a statement of justification for each use of a database limit in the methods section, the limitations section, or both [ 102 , 103 ]. In the examples above, only the last two provide some justification in the methods section (“to improve specificity” [ 99 ] and “contemporary studies included patient cohorts that are most reflective of current co-morbidities and patient characteristics as a result of the evolving obesity epidemic” [ 98 ]).

Report any limits or restrictions used or that no limits were used in the abstract, methods section, and in any supplementary materials, including the full search strategies (Item 8). Report the justification for any limits used within the methods section and/or in the limitations section.

Item 10. Search filters

Indicate whether published search filters were used (as originally designed or modified), and if so, cite the filter(s) used.

“For our MEDLINE search we added a highly sensitive filter for identifying randomised trials developed by the Cochrane Collaboration [38]. For Embase we used the filter for randomised trials proposed by the Scottish Intercollegiate Guidelines Network [ 104 ].” [ 105 ]

Filters are a predefined combination of search terms developed to identify references with a specific content, such as a particular type of study design (e.g., randomized controlled trials) [ 106 ], populations (e.g., the elderly), or a topic (e.g., heart failure) [ 107 ]. They often consist of a combination of subject headings, free-text terms, and publication types [ 107 ]. For systematic reviews, filters are generally recommended for use instead of limits built into databases, as discussed in Item 9, because they provide the much higher sensitivity (Table 2) required for a comprehensive search [ 108 ].
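To make the structure concrete, a study-design filter fragment for PubMed might combine publication types and free-text terms like the following. This is a schematic illustration only, loosely modeled on common randomized-controlled-trial filter components; it is not a validated filter, and in practice a published, validated filter should be used and cited as described below.

```
(randomized controlled trial[pt] OR controlled clinical trial[pt]
 OR randomized[tiab] OR placebo[tiab] OR randomly[tiab] OR trial[ti])
```

In a full search, a fragment like this would be combined with AND against the topic concepts; a validated published filter also comes with measured sensitivity and precision, which an ad hoc fragment does not.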

Any filters used as part of the search strategy should be cited, whether published in a journal article or other source. This enables readers to assess the quality of the filter(s) used, as most published search filters are validated and/or peer reviewed [ 106 , 107 ]. Many commonly used filters are published on the InterTASC Information Specialists’ Sub-Group website [ 109 ], in the Cochrane Handbook [ 4 , 39 ], and through the Health Information Research Unit of McMaster University [ 110 ].

Cite any search filter used in the methods section and describe adaptations made to any filter. Include the copied and pasted details of any search filter used or adapted for use as part of the full search strategy (Item 8).

Item 11. Prior work

Indicate when search strategies from other literature reviews were adapted or reused for a substantive part or all of the search, citing the previous review(s).

“We included [search strategies] used in other systematic reviews for research design [ 111 ], setting [ 112 , 113 ], physical activity and healthy eating [ 114 – 116 ], obesity [ 111 ], tobacco use prevention [ 117 ], and alcohol misuse [ 118 ]. We also used a search [strategy] for intervention (implementation strategies) that had been employed in previous Cochrane Reviews [ 119 , 120 ], and which was originally developed based on common terms in implementation and dissemination research.” [ 121 ]

Many authors may also examine previously published search strategies to develop the search strategies for their review. Sometimes, authors adapt or reuse these searches for different systematic reviews [ 122 ]. When basing a new search strategy on a published search strategy, it is appropriate to cite the original publication(s) consulted.

Search strategies differ from filters (Item 10) because search strategies are often developed for a specific project and are not necessarily designed for repeated use. Filters, on the other hand, are developed with the express purpose of reuse. Filters are often objectively derived, tested, and validated, whereas most search strategies published as part of a systematic review or other evidence synthesis are “best guess,” relying on the expertise of the searcher and review team [ 107 ].

As in the example above, researchers may rely on multiple prior published searches to construct a new search for a novel review. Many times, researchers will use the same searches from a published systematic review to update the existing systematic review. In either case, it is helpful to the readers to understand whether major portions of a search are being adapted or reused.

Report any prior work consulted, adapted, or reused in the methods section. Include the copied and pasted search strategies used, including portions or the entirety of any prior work used or adapted for use, in the full search strategy (Item 8).

Item 12. Updates

Report the methods used to update the search(es) (e.g., rerunning searches, email alerts).

“Ovid Auto Alerts were set up to provide weekly updates of new literature until July 09, 2012.” [ 123 ]
“Two consecutive searches were conducted and limited by publication type and by date, first from January 1, 1990, to November 30, 2012, and again from December 1, 2012, to July 31, 2015, in an updated search…. The original search strategy was used to model the updated search from December 1, 2012, to July 31, 2015. The updated search strategy was consistent with the original search; however, changes were required in the ERIC database search because of a change in the ERIC search algorithm. Excluded publication types were identical to the initial search. To improve specificity, the updated search was limited to human participants.” [ 99 ]

The literature search is usually conducted at the initial stage of the production of a systematic review. As a consequence, the results of a search may be outdated before the review is published [ 124 – 126 ]. The last search in a review should ideally be conducted less than 6 months before publication [ 90 , 92 , 125 ]. For this reason, authors often update searches by rerunning (Table 2) the same search(es) or otherwise updating searches before the planned publication date. Updating searches differs from updating a systematic review, i.e., when the same or different authors or groups decide to redo a published systematic review to bring its findings up to date. If authors are updating a published systematic review, either authored by the same review team or another, Item 11 contains relevant guidance.

When reporting search updates, the extent of reporting depends on methods used and any changes that were made while updating the searches. If there are no changes in information sources and/or search syntax (Table 2), it is sufficient to indicate the date the last search was run in the methods section and in the supplementary materials. If there are any changes in information sources and/or search syntax, the changes should be indicated (e.g., different set of databases, changes in search syntax, date restrictions) in the methods section. Authors should explain why these changes were made. When there were changes in the search strategy syntax, the original and the updated searches should both be reported as described in Item 8.

If authors use email alerts or other methods to update searches, these methods can be briefly described by indicating the method used, the frequency of any updates, the name of the database(s) used, or other relevant information that will help readers understand how the authors conducted search updates. If deduplication methods are used as part of the search update process, these methods can be described using guidance in Item 16.

Report the methods used to update the searches in the methods section and the supplementary materials, as described above.

Item 13. Dates of searches

For each search strategy, provide the date when the last search occurred.

“A comprehensive literature search was initially run on 26 February 2017 and then rerun on 5 February 2018….” [ 127 ]

Most literature databases are regularly updated with new citations as articles are published. Citations already in the database may also be updated once new information (such as indexing terms or citing articles) is available. As an example, MEDLINE added over 900,000 indexed citations (Table 2) in fiscal year 2018 [ 41 ]. In addition, the information gathered by databases (such as author affiliations in MEDLINE) can change over time. Because new citations are regularly being added, systematic review guidelines recommend updating searches throughout the writing process to ensure that all relevant articles are retrieved [ 6 , 92 ].

It is necessary for authors to document the date when searches were executed, either the date the initial search was conducted, if only searched once, or the most recent date the search was rerun. This allows readers to evaluate the currency of each search and understand what literature the search could have potentially identified [ 125 ]. In addition, it supports reproducibility and updating by allowing other researchers to use date limits to view the same “slice” of the database that the original authors used or to update a systematic review by searching from the last time point searched.
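The "slice" idea can be made concrete with NCBI's E-utilities, which let a searcher restrict a PubMed query to an entry-date window so that a rerun sees approximately the record set available on the original search date. The sketch below only builds the request URL; the query string and dates are hypothetical, not taken from any example in this guideline.

```python
from urllib.parse import urlencode

def esearch_url(query: str, mindate: str, maxdate: str) -> str:
    """Build a PubMed E-utilities esearch URL restricted to an
    entry-date window (dates formatted YYYY/MM/DD)."""
    base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
    params = {
        "db": "pubmed",
        "term": query,
        "datetype": "edat",  # entry date: when the record entered PubMed
        "mindate": mindate,
        "maxdate": maxdate,
        "retmax": 0,         # return the hit count only, no record IDs
    }
    return base + "?" + urlencode(params)

# Hypothetical rerun bounded by a documented last-search date:
url = esearch_url('"systematic review"[ti]', "1990/01/01", "2018/02/05")
```

Dropping the same date window into an updated search is also how the "search from the last time point searched" strategy for review updates can be implemented.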

Report the date of the last search of the primary information sources used in the abstract for optimal clarity for readers [ 128 ]. Report the time frame during which searches were conducted, the initial search date(s), and/or the last update search date(s) in the methods section. Report the initial and/or last update search date with each complete search strategy in the supplementary materials, as in the examples for Item 8.

Item 14. Peer review

Describe any search peer review process.

“The strategies were peer reviewed by another senior information specialist prior to execution using the PRESS Checklist [ 1 ].” [ 129 ]

Peer reviewing search strategies is an increasingly valued component of search strategy development for systematic reviews. Expert guidance recommends taking this step to help increase the robustness of the search strategy [ 6 , 74 ]. Peer reviewing (Table 2) searches helps guide and improve electronic search strategies. One of peer review’s main benefits is the reduction of errors [ 23 , 130 ]. Peer review may also increase the number of relevant records found for inclusion in reviews, thus improving the overall quality of the systematic review [ 131 ].

Authors should consider using the Peer Review of Electronic Search Strategies (PRESS) Guideline statement, a practice guideline for literature search peer review outlining the major components important to review and the benefits of peer reviewing searches [ 1 ]. Authors should strongly consider having the search strategy peer reviewed by an experienced searcher, information specialist, or librarian [ 1 , 131 ]. Though peer review may be conducted generally with publication of a protocol, for example, this item is designed to document search-specific peer review.

Describe the use of peer review in the methods section.

Item 15. Total records

Document the total number of records identified from each database and other information sources.

Methods section. “A total of 3251 citations were retrieved from the six databases and four grey literature websites.” [ 133 ] Flow diagram. Fig. 1: “PRISMA 2009 flow diagram” [ 132 ]

Recording the flow of citations through the systematic review process is a key component of the PRISMA Statement [ 9 , 35 ]. It is helpful to identify how many records (Table 2) were identified within each database and additional source. Readers can use this information to see whether databases or expert contacts constituted the majority of the records reviewed, for example. Knowing the number of records from each source also helps with reproducibility. If a reader tries to duplicate a search from a systematic review, one would expect to retrieve nearly the same results when limiting to the timeframe in the original review. If, instead, the searcher locates a drastically different number of results than reported in the original review, this can be indicative of errors in the published search [ 23 ] or major changes to a database, both of which might be reasons to update a systematic review or view the systematic review’s results with skepticism.

Report the total number of references retrieved from all sources, including updates, in the results section. Report the total number of references from each database and information source in the supplementary materials. If space permits, report the total number of references from each database in the PRISMA flow diagram [ 35 ].
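The reproducibility check described above can be sketched as a simple comparison of per-source totals: a rerun whose counts diverge sharply from the published counts warrants closer inspection. All numbers and the 10% tolerance below are illustrative assumptions, not figures from this guideline.

```python
# Per-source totals as reported in a (hypothetical) published review,
# and the totals obtained when attempting to reproduce its searches.
reported = {"MEDLINE": 1250, "Embase": 1474, "CENTRAL": 312}
reproduced = {"MEDLINE": 1262, "Embase": 2901, "CENTRAL": 310}

def flag_discrepancies(reported, reproduced, tolerance=0.10):
    """Return sources whose reproduced count deviates from the reported
    count by more than `tolerance` (as a fraction of the reported count)."""
    flagged = {}
    for source, n_reported in reported.items():
        n_new = reproduced.get(source, 0)
        if n_reported and abs(n_new - n_reported) / n_reported > tolerance:
            flagged[source] = (n_reported, n_new)
    return flagged

print(flag_discrepancies(reported, reproduced))
# Embase roughly doubled: a possible sign of a search error or a major
# database change, as discussed above.
```

Small drifts (here, MEDLINE and CENTRAL) are expected because databases continue to add and re-index records after the original search date.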

Item 16. Deduplication

Describe the processes and any software used to deduplicate records from multiple database searches and other information sources.

“Duplicates were removed by the librarians (LP, PJE), using EndNote's duplicate identification strategy and then manually.” [ 134 ]

Databases contain significant overlap in content. When searching in multiple databases and additional information sources, as is necessary for a systematic review, authors often employ a variety of techniques to reduce the number of duplicates within their results prior to screening [ 135 – 138 ]. Techniques vary in their efficacy, sensitivity, and specificity (Table 2) [ 136 , 138 ]. Knowing which method is used enables readers to evaluate the process and understand to what extent these techniques may have removed false positive duplicates [ 138 ]. Authors should describe and cite any software or technique used, when applicable. If duplicates were removed manually, authors should include a description.
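A minimal deduplication pass might prefer DOI matching (high specificity) and fall back to a normalised title key when no DOI is present. This is a hedged sketch with invented sample records; real tools such as EndNote or Covidence apply considerably more elaborate heuristics (page numbers, author matching, fuzzy similarity).

```python
import re

def title_key(title: str) -> str:
    """Lowercase a title and strip punctuation/whitespace so that
    trivially different renderings of the same title collide."""
    return re.sub(r"[^a-z0-9]", "", title.lower())

def deduplicate(records):
    """Keep the first record seen for each DOI, or for each
    normalised title when the DOI is missing."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or title_key(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"title": "A Trial of X", "doi": "10.1000/x1"},
    {"title": "A trial of X.", "doi": "10.1000/x1"},  # same DOI, kept once
    {"title": "A Trial of Y", "doi": None},
    {"title": "A TRIAL OF Y", "doi": None},           # same title, no DOI
]
print(len(deduplicate(records)))  # 2 unique records remain
```

Whatever method is used, reporting it (and the resulting record count) lets readers judge how aggressively true and false duplicates may have been removed.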

Report any deduplication method used in the methods section. The total number of references after deduplication should be reported in the PRISMA flow diagram [ 35 ].

Part 5. Discussion and conclusions

The PRISMA-S extension is designed to be used in conjunction with PRISMA 2020 [ 35 ] and PRISMA extensions including PRISMA-P for protocols [ 36 ], PRISMA-ScR for scoping reviews [ 139 ], the PRISMA Network Meta-analyses statement [ 140 ], and PRISMA-IPD for systematic reviews using individual patient data [ 141 ]. It may also be used with other reporting guidelines that relate to systematic reviews and related review types, such as RepOrting standards for Systematic Evidence Syntheses (ROSES) [ 142 ]. It provides additional guidance for systematic review teams, information specialists, librarians, and other researchers whose work contains a literature search as a component of the research methods. Though its origins are in the biomedical fields, PRISMA-S is flexible enough to be applied in all disciplines that use method-driven literature searching. Ultimately, PRISMA-S attempts to give systematic review teams a framework that helps ensure transparency and maximum reproducibility of the search component of their review.

PRISMA-S is intended to capture and provide specific guidance for reporting the most common methods used in systematic reviews today. As new methods and information sources are adopted, authors may need to adjust their reporting methods to accommodate new processes. Currently, PRISMA-S does not address using text mining or text analysis methods to create the search, for example, though this is an increasingly common way for information specialists to develop robust and objective search strategies [ 143 – 145 ]. Likewise, PRISMA-S does not require that decisions about the rationale behind choices in search terms and search construction be recorded, though this provides readers a great deal of insight. In the future, methods and rationales used to create search strategies may become more important for reproducibility.

PRISMA-S offers extensive guidance for many different types of information sources and methods, many of which are not described in detail in other reporting guidelines relating to literature searching. This includes detailed information on reporting study registry searches, web searches, multi-database searches, and updates. PRISMA-S can help authors report all components of their search, hopefully making the reporting process easier. As a note, PRISMA-S provides guidance on transparent reporting for authors; it is not intended as a tool to guide the conduct of a systematic review or to evaluate the quality of a search or a systematic review.

The PRISMA-S checklist is available for download in Word and PDF formats from the PRISMA Statement web site [ 37 ]. The checklist should be used together with its Explanation & Elaboration documentation to provide authors with guidance for the complexities of different types of information sources and methods.

We intend to work with systematic review and information specialist organizations to broadly disseminate PRISMA-S and encourage its adoption by journals. In addition, we plan to host a series of webinars discussing how to use PRISMA-S most effectively. These webinars will also be available for later viewing and will serve as a community resource.

We hope that journal editors will recommend authors of systematic reviews and other related reviews to use PRISMA-S and submit a PRISMA-S checklist with their manuscripts. We also hope that journal editors will encourage more stringent peer review of systematic review searches to ensure greater transparency and reproducibility within the review literature.

Acknowledgements

We would like to thank all of the members of the PRISMA-S Group, which is comprised of participants in the Delphi process, consensus conference, or both. PRISMA-S Group members include Heather Blunt (Dartmouth College), Tara Brigham (Mayo Clinic in Florida), Steven Chang (La Trobe University), Justin Clark (Bond University), Aislinn Conway (BORN Ontario and CHEO Research Institute), Rachel Couban (McMaster University), Shelley de Kock (Kleijnen Systematic Reviews Ltd), Kelly Farrah (Canadian Agency for Drugs and Technologies in Health (CADTH)), Paul Fehrmann (Kent State University), Margaret Foster (Texas A & M University), Susan A. Fowler (Washington University in St. Louis), Julie Glanville (University of York), Elizabeth Harris (La Trobe University), Lilian Hoffecker (University of Colorado Denver), Jaana Isojarvi (Tampere University), David Kaunelis (Canadian Agency for Drugs and Technologies in Health (CADTH)), Hans Ket (VU Amsterdam), Paul Levay (National Institute for Health and Care Excellence (NICE)), Jennifer Lyon, Jessie McGowan (uOttawa), M. Hassan Murad (Mayo Clinic), Joey Nicholson (NYU Langone Health), Virginia Pannabecker (Virginia Tech), Robin Paynter (VA Portland Health Care System), Rachel Pinotti (Icahn School of Medicine at Mount Sinai), Amanda Ross-White (Queens University), Margaret Sampson (CHEO), Tracy Shields (Naval Medical Center Portsmouth), Adrienne Stevens (Ottawa Hospital Research Institute), Anthea Sutton (University of Sheffield), Elizabeth Weinfurter (University of Minnesota), Kath Wright (University of York), and Sarah Young (Carnegie Mellon University). We would also like to thank Kate Nyhan (Yale University), Katharina Gronostay (IQWiG), the many others who contributed to the PRISMA-S project anonymously or as draft reviewers, and our peer reviewers. We would like to give special thanks to the late Douglas G. 
Altman (D.G.A.; University of Oxford) for his support and guidance, and the co-chairs of the Medical Library Association’s Systematic Reviews SIG in 2016, Margaret Foster (Texas A & M University) and Susan Fowler (Washington University in St. Louis), for allowing us to use one of their meeting times for the consensus conference.


Authors’ contributions

M.L.R. conceived and designed the study, conducted the thematic and quantitative analyses, curated the data, drafted the manuscript, and reviewed and edited the manuscript. M.L.R. is the guarantor. J.B.K. and S.K. contributed to the design of the study, developed the literature search strategies, contributed to the thematic content analyses, drafted a portion of the Elaboration & Explanation, and reviewed and edited the manuscript. J.B.K. developed the survey instrument. M.L.R., J.B.K., and S.K. hosted and organized the consensus conference. S.W. and A.P.A. contributed to the thematic content analysis, drafted a portion of the Elaboration & Explanation, and reviewed and edited the manuscript. S.W. supervised the draft revision documentation. D.M. helped conceive and design the study. M.J.P. provided substantive review and editing of the checklist, Explanation & Elaboration, and final manuscript. The author(s) read and approved the final manuscript.

Melissa Rethlefsen was funded in part by the University of Utah’s Center for Clinical and Translational Science under the National Center for Advancing Translational Sciences of the National Institutes of Health Award Number UL1TR002538 in 2017–2018. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Shona Kirtley was funded by Cancer Research UK (grant C49297/A27294). The funder had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The views expressed are those of the authors and not necessarily those of Cancer Research UK.

Matthew Page is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618).

David Moher is supported by a University Research Chair, University of Ottawa, Ottawa, Canada.

The consensus conference was sponsored by the Systematic Reviews SIG of the Medical Library Association. There was no specific funding associated with this event.


Ethics approval and consent to participate

This study was declared exempt by the University of Utah Institutional Review Board (IRB_00088425). Consent was received from all survey participants.

Consent for publication

Not applicable

Competing interests

The authors declare no competing interests. MJP and DM are leading the PRISMA 2020 update.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Melissa L. Rethlefsen, Email: mlrethlefsen@gmail.com

Shona Kirtley, Email: [email protected] .

Siw Waffenschmidt, Email: [email protected] .

Ana Patricia Ayala, Email: [email protected] .

David Moher, Email: dmoher@ohri.ca

Jonathan B. Koffel, Email: jbkoffel@umn.edu

PRISMA-S Group: Heather Blunt, Tara Brigham, Steven Chang, Justin Clark, Aislinn Conway, Rachel Couban, Shelley de Kock, Kelly Farrah, Paul Fehrmann, Margaret Foster, Susan A. Fowler, Julie Glanville, Elizabeth Harris, Lilian Hoffecker, Jaana Isojarvi, David Kaunelis, Hans Ket, Paul Levay, Jennifer Lyon, Jessie McGowan, M. Hassan Murad, Joey Nicholson, Virginia Pannabecker, Robin Paynter, Rachel Pinotti, Amanda Ross-White, Margaret Sampson, Tracy Shields, Adrienne Stevens, Anthea Sutton, Elizabeth Weinfurter, Kath Wright, and Sarah Young

