Systematic reviews (SRs) are usually conducted by a highly specialized group of researchers. The routine involvement of methodological experts is a core recommendation. The present commentary describes the qualifications required of information specialists and statisticians involved in SRs, as well as their tasks, the methodological challenges they face, and potential future areas of involvement.
Tasks and qualifications
Information specialists select the information sources, develop search strategies, conduct the searches, and report the results. Statisticians select the methods for evidence synthesis, assess the risk of bias, and interpret the results. The minimum requirements for their involvement in SRs are a suitable university degree (e.g., in statistics or library and information science, or an equivalent qualification), methodological and content expertise, and several years of experience.
Key arguments
The complexity of conducting SRs has increased greatly owing to a massive rise in the amount of available evidence and in the number and complexity of SR methods, particularly statistical and information retrieval methods. Additional challenges arise in the actual conduct of an SR, such as judging how complex the research question may become and what hurdles may emerge during the course of the project.
Conclusion
SRs are becoming increasingly complex to conduct, and information specialists and statisticians should routinely be involved from the start of the SR. This increases the trustworthiness of SRs as the basis for reliable, unbiased, and reproducible health policy and clinical decision making.
Peer review of searches is a process whereby both the search strategies and the search process description are reviewed, ideally using an evidence-based checklist.
Rationale
As the search strategy underpins any well-conducted evidence synthesis, its quality could affect the final result. Evidence shows, however, that search strategies are prone to error.
Findings
Awareness and use of the PRESS Evidence-Based Checklist for peer review of search strategies are increasing. Such review is carried out at the outset of an evidence synthesis, before the searches are run, and is now recommended by a number of evidence synthesis organizations.
Recommendations and conclusions
Searches for evidence syntheses should be peer reviewed by a suitably qualified and experienced librarian or information specialist after being designed, ideally, by another suitably qualified and experienced librarian or information specialist. Peer review of searches should take place at two important stages in the evidence synthesis process: at the outset of the project, before the searches are run, and at the prepublication stage. There is little empirical evidence, however, to support the effectiveness of peer review of searches, and further research is required to assess this. Those wishing to stay up to date with the latest developments in information retrieval, including peer review of searches, should consult the SuRe Info resource (http://www.sure-info.org). It provides easy access to the findings of current information retrieval methods research and thereby supports more research-based information retrieval practice.
Solutions like crowd screening and machine learning can assist systematic reviewers with heavy screening burdens but require training sets containing a mix of eligible and ineligible studies. This study explores using PubMed's Best Match algorithm to create small training sets containing at least five relevant studies.
Methods
Six systematic reviews were examined retrospectively. Their MEDLINE search strategies were converted and run in PubMed, and the ranking of the included studies was compared under the Best Match and Most Recent sort conditions.
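A minimal sketch of this rank comparison, in Python using Biopython's Entrez wrapper for the NCBI E-utilities, is given below. The query string and the PMIDs of a review's included studies are hypothetical placeholders; sort="relevance" requests PubMed's Best Match order, while omitting the sort parameter returns PubMed's default Most Recent order.

from Bio import Entrez

Entrez.email = "reviewer@example.org"  # NCBI asks for a contact address

QUERY = '"exercise therapy"[MeSH] AND "low back pain"[MeSH]'  # placeholder query
INCLUDED = {"12345678", "23456789"}  # placeholder PMIDs of included studies

def ranked_pmids(query, sort=None, retmax=10000):
    """Return PMIDs in the order PubMed ranks them for the given sort."""
    kwargs = {"db": "pubmed", "term": query, "retmax": retmax}
    if sort:
        kwargs["sort"] = sort
    handle = Entrez.esearch(**kwargs)
    pmids = list(Entrez.read(handle)["IdList"])
    handle.close()
    return pmids

def included_ranks(ordered):
    """1-based ranks of the known included studies within the retrieval."""
    return sorted(i + 1 for i, pmid in enumerate(ordered) if pmid in INCLUDED)

print("Best Match ranks: ", included_ranks(ranked_pmids(QUERY, sort="relevance")))
print("Most Recent ranks:", included_ranks(ranked_pmids(QUERY)))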
Results
Retrieval sizes for the systematic reviews ranged from 151 to 5,406 records, and the numbers of relevant records ranged from 8 to 763. The median ranking of relevant records was higher under Best Match than under Most Recent sort for all six reviews. Best Match placed a total of thirty relevant records in the first fifty results, including at least one for each systematic review; Most Recent sorting placed only ten relevant records in the first fifty. Best Match thus outperformed Most Recent in all cases and placed five or more relevant records in the first fifty in three of the six cases.
Discussion
Using a predetermined set size such as fifty may not yield enough true positives for an effective systematic review training set. However, screening PubMed records ranked by Best Match and continuing until the desired number of true positives is identified is both efficient and effective.
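A minimal sketch of that stopping rule follows: screen records in Best Match order and stop once the desired number of true positives has been found. screen_record() is a hypothetical stand-in for a human eligibility judgement, and the query is again a placeholder.

from Bio import Entrez

Entrez.email = "reviewer@example.org"  # NCBI asks for a contact address

def screen_record(pmid):
    """Placeholder for a manual screening decision on a single record."""
    raise NotImplementedError("replace with a human eligibility judgement")

def build_training_set(query, positives_needed=5):
    """Screen PMIDs in Best Match order until enough relevant records are found.

    Returns every screened PMID, so the result mixes eligible and ineligible
    records, as a machine-learning training set requires.
    """
    handle = Entrez.esearch(db="pubmed", term=query, sort="relevance", retmax=10000)
    pmids = Entrez.read(handle)["IdList"]
    handle.close()

    screened, positives = [], 0
    for pmid in pmids:
        screened.append(pmid)
        if screen_record(pmid):
            positives += 1
            if positives >= positives_needed:
                break
    return screened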
Conclusions
The Best Match sort in PubMed improves the ranking of relevant records and increases their proportion among the first fifty results relative to sorting by recency.