Reporting Systematic Reviews: Some Lessons from a Tertiary Study

Abstract

Context: Many of the systematic reviews published in software engineering are related to research or methodological issues and hence are unlikely to be of direct benefit to practitioners or teachers. Those that are relevant to practice and teaching need to be presented in a form that makes their findings usable with minimum interpretation.

Objective: We examined a sample of the many systematic reviews published over a period of six years in order to assess how well they are reported and to identify useful lessons about how this might be done.

Method: We undertook a tertiary study, performing a systematic review of systematic reviews. Our study found 178 systematic reviews published in a set of major software engineering journals over the period 2010-2015. Of these, 37 provided recommendations or conclusions of relevance to education and/or practice, and we used the DARE criteria, as well as other attributes related to the systematic review process, to analyse how well they were reported.

Results: We have derived a set of 12 'lessons' that could help authors with reporting the outcomes of a systematic review in software engineering. We also provide an associated checklist for use by journal and conference referees.

Conclusion: There are several areas where better reporting is needed, including quality assessment, synthesis, and the procedures followed by the reviewers. Researchers, practitioners, teachers and journal referees would all benefit from better reporting of systematic reviews, both for clarity and for establishing the provenance of any findings.

Acceptance Date Oct 27, 2017
Publication Date Mar 1, 2018
Journal Information and Software Technology
Print ISSN 0950-5849
Publisher Elsevier
Pages 62-74
DOI https://doi.org/10.1016/j.infsof.2017.10.017
Keywords Systematic Review; Reporting Quality; Provenance of findings
Publisher URL http://www.sciencedirect.com/science/article/pii/S0950584916303548
