
New challenges and solutions for information retrieval: information specialists as “bricklayers” in the health technology assessment framework

Published online by Cambridge University Press:  24 June 2021

Siw Waffenschmidt*
Affiliation:
Institute for Quality and Efficiency in Health Care (IQWiG), Cologne, Germany
David Kaunelis
Affiliation:
Canadian Agency for Drugs and Technologies in Health (CADTH), Ottawa, Ontario, Canada
Ingrid Harboe
Affiliation:
Norwegian Institute of Public Health (NIPH), Oslo, Norway
*
Author for correspondence: Siw Waffenschmidt, E-mail: [email protected]

Abstract

We are pleased to present this information retrieval issue. Our goal is to highlight the variety of tasks and organizational structures of information retrieval within health technology assessment (HTA). This special issue was planned and organized by the HTAi Information Retrieval Group (IRG). The choice of publications in this issue reflects the versatility and methodological competence in information retrieval that is required of information specialists. Furthermore, it provides insights into the daily challenges they face.

Type
Editorial
Copyright
Copyright © The Author(s), 2021. Published by Cambridge University Press

We are pleased to present this information retrieval issue. Our goal is to highlight the variety of tasks and organizational structures of information retrieval within health technology assessment (HTA). This special issue was planned and organized by the HTAi Information Retrieval Group (IRG). The IRG was the first interest group formed within HTAi, in 1997, and remains among its most active groups.

Information specialists can be seen as the bricklayers of the HTA building: we build the walls and pour the concrete for the foundations, and without us the structure would have nothing to stand on. Although the call for routine involvement of information specialists in evidence synthesis is supported by research findings (1–5), our work is often not recognized, and due to a lack of resources or awareness we are still insufficiently involved. Embedded in systematic review teams, we play a pivotal role in helping to provide the evidence base, avoid bias, and reduce research waste.

Many changes have taken place in the field of information retrieval over the past few years. One might assume that the work has become easier due to digitalization, but the opposite seems to be the case: it has become more complex, more technical, and highly specialized. In addition, the work environment is challenging. Information specialists work in a team only in exceptional cases, for example, in large HTA agencies such as the National Institute for Health and Care Excellence (NICE) or the Canadian Agency for Drugs and Technologies in Health (CADTH). As we usually work alone, it is important to conduct evidence-based information retrieval and to collaborate with peers, which has resulted in the establishment of strong information specialist networks.

These challenges are reflected in the variety of this issue's articles, which cover both current and classical information retrieval topics, such as validated search filters for the new and challenging topic of health apps by Ayiku et al. (6). The NICE health apps search filters for MEDLINE and Embase achieve 98.6 percent recall and were tested against nearly 700 relevant references. This is a successful example of how information specialists provide much-needed evidence for new methodological challenges, helping researchers, including noninformation specialists, to identify evidence efficiently and, more importantly, to save resources and costs.
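For readers less familiar with how filter performance is reported: recall is the proportion of a known set of relevant records that a search filter retrieves. As a purely illustrative calculation (the figures here are round numbers, not the exact counts reported by Ayiku et al.), a filter that retrieves 690 of 700 known relevant references achieves

\[
\text{recall} = \frac{\text{relevant records retrieved}}{\text{total relevant records}} = \frac{690}{700} \approx 0.986,
\]

that is, roughly 98.6 percent.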

Another task of information specialists is the peer review of search strategies using the PRESS checklist. Lefebvre and Duffy (7) summarize the current evidence on peer review of searches and provide an overview of how to use the checklist. The reporting of search methods using tools such as the PRISMA checklist and the ROBIS tool was evaluated by de Kock et al. (8). They concluded that 90 percent of systematic reviews fail to report their search methods adequately and also fail to conduct comprehensive searches across a wide range of resources. This finding supports the call for routine involvement of information specialists.

This issue also contains a timely methodological evaluation of new database features: the PubMed "Best Match" sorting option, evaluated by Sampson et al. (9) (a brief illustrative sketch follows this paragraph). This is a good example of the methodological support offered by information specialists: new database features or techniques are evaluated and can then be applied according to the evidence. It also exemplifies why information specialist networks are needed and appreciated, because new database features or changes in interfaces, such as the changes in PubMed's sorting options, happen often and unexpectedly. In addition, two articles show how diverse the workplace of information specialists can be. Stadig and Svanberg (10) explain how information specialists are involved in hospital-based HTA in Sweden, whereas, from a broader European perspective, Waffenschmidt et al. (11) present information specialists' involvement and collaboration within the European HTA network EUnetHTA and suggest ideas for future projects.
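As a minimal, purely illustrative sketch of how relevance-ranked retrieval can be exercised programmatically (this is not the method used by Sampson et al.; it simply assumes NCBI's public E-utilities esearch endpoint with its documented sort=relevance parameter, and the query shown is hypothetical):

# Sketch: fetch PubMed IDs ranked by the "Best Match" (relevance) sort
# via the NCBI E-utilities esearch endpoint. Illustrative only.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def best_match_pmids(query, retmax=20):
    """Return up to `retmax` PMIDs for `query` in relevance (Best Match) order."""
    params = urlencode({
        "db": "pubmed",
        "term": query,
        "sort": "relevance",  # corresponds to "Best Match" in the PubMed web interface
        "retmax": retmax,
        "retmode": "json",
    })
    with urlopen(f"{ESEARCH}?{params}") as response:
        return json.load(response)["esearchresult"]["idlist"]

# Hypothetical example query: the top-ranked records could be screened first,
# for instance when assembling an enriched training set.
print(best_match_pmids("systematic review AND search filter"))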

Last but not least, Isojärvi and Glanville (12) describe SuRe Info, a freely available resource for information specialists developed with the support of HTAi. Summarized Research in Information Retrieval for HTA (SuRe Info) is a website that summarizes research-based information on effective and efficient evidence identification for the different aspects of HTA and evidence synthesis. Produced by the HTAi IRG, it is a unique resource, managed and coordinated by an editorial team of information specialists established by the IRG. Nearly thirty authors from eight countries regularly volunteer to update its seventeen chapters.

The choice of publications in this issue reflects the versatility and methodological competence in information retrieval that is required of information specialists. Furthermore, it provides insights into the daily challenges they face.

We would like to thank the members of the IRG as well as other information specialists in the scientific community for their support. Without their contributions in proposing articles and providing peer reviews, it would not have been possible to produce this special issue.

Acknowledgments

The authors thank Natalie McGauran for editorial support.

Financial support

This research received no specific funding from any agency in the public, commercial, or not-for-profit sectors.

References

1. Golder, S, Loke, Y, McIntosh, HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol. 2008;61:440–8.
2. Koffel, JB. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: A cross-sectional survey of recent authors. PLoS ONE. 2015;10:e0125931.
3. Meert, D, Torabi, N, Costella, J. Impact of librarians on reporting of the literature searching component of pediatric systematic reviews. J Med Libr Assoc. 2016;104:267–77.
4. Rethlefsen, ML, Farrell, AM, Osterhaus Trzasko, LC, Brigham, TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68:617–26.
5. Zhang, L, Sampson, M, McGowan, J. Reporting the role of the expert searcher in Cochrane reviews. Evid Based Libr Inf Pract. 2006;1:316.
6. Ayiku, L, Hudson, T, Glover, S, Walsh, N, Adams, R, Deane, J, et al. The NICE MEDLINE and Embase (Ovid) health apps search filters: Development of validated filters to retrieve evidence about health apps. Int J Technol Assess Health Care. 2020:17.
7. Lefebvre, C, Duffy, S. Peer review of searches for studies for health technology assessments, systematic reviews and other evidence synthesis. Int J Technol Assess Health Care. 2021.
8. de Kock, S, Stirk, L, Ross, J, Duffy, S, Noake, C, Misso, K. Systematic review search methods evaluated using the preferred reporting of items for systematic reviews and meta-analyses and the risk of bias in systematic reviews tool. Int J Technol Assess Health Care. 2020:15.
9. Sampson, M, Nama, N, O'Hearn, K, Murto, K, Nasr, A, Katz, SL, et al. Creating enriched training sets of eligible studies for large systematic reviews: The utility of PubMed's Best Match algorithm. Int J Technol Assess Health Care. 2020:16.
10. Stadig, I, Svanberg, T. Overview of information retrieval in a hospital-based health technology assessment center in a Swedish region. Int J Technol Assess Health Care. 2021;37:e52.
11. Waffenschmidt, S, van Amsterdam-Lunze, M, Gomez, RI, Rehrmann, M, Harboe, I, Hausner, E. Information specialist collaboration in Europe: Collaborative methods, processes, and infrastructure through EUnetHTA. Int J Technol Assess Health Care. 2020:16.
12. Isojärvi, J, Glanville, J. Evidence-based searching for health technology assessment: Keeping up to date with SuRe Info. Int J Technol Assess Health Care. 2021;37:e51.