
Transparency in Practice in Qualitative Research

Published online by Cambridge University Press:  22 December 2020

Diana Kapiszewski, Georgetown University
Sebastian Karcher, Syracuse University

Type: Opening Political Science

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2020. Published by Cambridge University Press on behalf of the American Political Science Association

The discipline of political science has been engaged in vibrant debate about research transparency for more than three decades. In the abstract, augmenting transparency implies the same steps in all types of political science scholarship: making the empirical information that underpins our work meaningfully accessible; elucidating how that information was collected or generated; and describing how the information was interpreted and/or analyzed.Footnote 1 Nonetheless, the way in which transparency is pursued—and the type and difficulty of the challenges that pursuing it presents—vary across research traditions.

Scholars who collect, generate, and draw on qualitative evidence in their work are relative newcomers to the debate about transparency. Their more vigorous engagement during the past decade has brought important new voices and viewpoints to the conversation and raised new issues and questions. In particular, the recently completed Qualitative Transparency Deliberations (QTD; see www.qualtd.net), directed by Tim Büthe and Alan Jacobs, represent a crucial step forward. Called for during the business meeting of the Qualitative and Multi-Method Research (QMMR) section of the American Political Science Association (APSA) during the 2015 APSA conference, the QTD involved 13 working groups (and hundreds of political scientists beyond those groups), and ultimately produced 13 thoughtful final reports.Footnote 2 Among their many lessons, the QTD demonstrated that practices for making scholarship more transparent—and the challenges that doing so poses—vary among forms of qualitative inquiry.Footnote 3

This article first briefly reviews the literature on transparency in qualitative inquiry, describing what we see as its evolution. Next, we highlight some considerations that shape how and how much researchers pursue transparency. We then describe a set of exciting, creative techniques that scholars are developing and pioneering to enhance the transparency of qualitative research. These strategies can help scholars to illustrate research practices, clarify the empirical underpinnings of their work, and facilitate its evaluation, as well as balance the various considerations that bear on achieving transparency. The diversity of these emerging strategies demonstrates that transparency is not an all-or-nothing proposition and can be pursued in many different ways. The conclusion summarizes and offers thoughts on the way forward.

STATE OF THE DEBATE

During the past decade, political scientists who generate, collect, interpret, analyze, and publish scholarly work based on qualitative data have engaged in energetic dialogue about research transparency.Footnote 4 One way to characterize the arc of the debate is to suggest that it began with thoughtful consideration of “whether” scholars who use qualitative data and methods can and should seek to make their work more transparent, and then progressed to the question of “what” information scholars should share about data production and analysis, and what data they should share, in pursuing transparency. The debate only recently has begun to consider “how” transparency can be achieved—that is, which concrete techniques and strategies scholars can use to augment the transparency of their work.

This section presents a very general overview of the literature that addresses the first two of these questions. We take up the third question in the third and fourth sections of the article, first discussing some challenges related to achieving transparency and then offering a series of strategies for doing so. The literature on the first two questions is rich and extensive: it includes interventions by scholars in various academic disciplines (e.g., political science, education, health, and sociology); by practitioners in information schools, university libraries, and data repositories; and by scholars from around the globe. Our review, by necessity, is incomplete given space limitations.Footnote 5

The question of whether political scientists who engage in qualitative research can and should make their work more transparent has mainly played out (in written form) in a series of journal symposia published since 2010.Footnote 6 In the opening articles (Büthe and Jacobs 2015; Golder and Golder 2016) and in contributions to these symposia, scholars have discussed the intellectual benefits—for producers and consumers of scholarship based on qualitative data and methods—of making such research more transparent, as well as the epistemological, ethical, legal, and practical challenges of doing so. In addition to the contributions to these symposia, several stand-alone articles written by scholars from various disciplines have addressed these questions, with some advocating for transparency (e.g., Corti 2006; Elman, Kapiszewski, and Lupia 2018; Gleditsch and Janz 2016; Miguel et al. 2014) and others registering concerns (e.g., Monroe 2018; Schwartz-Shea and Yanow 2016; Tripp 2018; Tsai et al. 2016).

The issue of what information about data generation and analysis and which data scholars should share likewise has been considered in a range of written work. The QTD advanced the debate in productive ways, offering novel insights on the meaning and “content” of transparency. In their overview article, Jacobs et al. (2019, 25–27) provided useful lists of information that can be shared to increase transparency, and various reports (e.g., Elkins, Spitzer, and Tallberg 2019; Schneider, Vis, and Koivu 2019; and others mentioned elsewhere in this article) offered guidance about what can be shared at low cost, low risk, and/or efficiently to achieve transparency. Additional examples of work considering these questions include Barnes and Weller’s (2017) discussion of what information can elucidate analytic processes in process-tracing work and Tuval-Mashiach’s (2017, 130–34) suggestion that scholars answer three reflective questions in pursuit of transparency (i.e., what they did and how and why they did it).

CONSIDERATIONS IN MAKING RESEARCH TRANSPARENT

All scholars weigh and balance various factors and pressures as they consider how to make their work more transparent. Among these are two obligations whose fulfillment trumps other aims: pursuing transparency ethically (see, e.g., Carusi and Jirotka 2009; Chauvette, Schick-Makaroff, and Molzahn 2019) and legally. For example, scholars must obtain the informed consent of the people they involve in their research in order to ethically share the information that those “human participants” convey. Scholars and participants must reach unambiguous agreement—ideally, through a consultative process—on what, when, where, how, and with whom information can be shared, and scholars must adhere strictly to those agreements, without compromise.Footnote 7 Likewise, scholars cannot legally share work that is under copyright if permission cannot be secured. These issues are discussed in more detail in the next section.

Scholars also consider other factors when deciding how to pursue research transparency. These include (but are not limited to) intellectual considerations (i.e., how to pursue transparency in ways that will showcase the rigor and power of research; see, e.g., Elman and Lupia 2016, 51; Fujii 2016, 25–26); resource considerations and opportunity costs (i.e., how much time and money to spend on pursuing transparency and what the cost of not spending those resources elsewhere will be; see, e.g., Saunders 2014, 694–97); and expositional considerations (i.e., how practically to pursue transparency while ensuring that the text remains readable and suitable for standard publication formats; see, e.g., Moravcsik 2012, 36).

TRANSPARENCY IN PRACTICE

This section considers the question of “how” transparency can be achieved in qualitative inquiry. We outline a set of techniques that scholars can use to ethically and legally increase their work’s transparency while balancing other considerations relevant to their situation and project. The discussion draws on the literature on transparency in qualitative research and on our experiences working with scholars who pursue it. Exciting and promising techniques beyond those discussed here are surely being developed and used. Scholars should consider which strategies to use before beginning research because their choices bear on how they track the research process as they carry out their work.

Preregistration

Preregistration entails specifying a research project’s rationale, hypotheses, design, and plan for data generation and analysis before initiating data collection. Interest in preregistration for qualitative work has been increasing (Haven and Van Grootel 2019; Jacobs 2020; Kern and Gleditsch 2017; Piñeiro and Rosenblatt 2016).Footnote 8 There are good reasons to be skeptical of the need for and utility of preregistration in qualitative research given the often exploratory nature of such work (Haven and Van Grootel 2019, 6–8). Nonetheless, having a timestamped record of the original research and analysis plan, as well as changes made during the research process, can help scholars to stay on track and to carefully consider and justify (for themselves and their readers) changes to their design. Creating and maintaining such a plan may be a relatively low-cost way for scholars to demonstrate the rigor of their research without overloading a publication with methodological description. A pioneering example of a preregistered qualitative case study is presented in Christensen, Hartman, and Samii (2019, 26–31), who used case-study evidence to validate (and extend) quantitative findings.

Methodological Appendices

Methodological appendices—that is, supplementary material that discusses how an author collected, generated, and analyzed data—can advance transparency in various types of research and can take several forms.Footnote 9 Creating these appendices allows researchers to augment the transparency of their work without affecting its readability and length—even when sharing the data underpinning the work is not possible. Journals rarely place length limitations on these appendices,Footnote 10 affording scholars great latitude in describing their research process.

For instance, scholars who engage in ethnography or interpretive research can provide an extended discussion of these issues in a stand-alone document that supplements the space-constrained text of research articles (see Lester and Anders 2018, Reyes 2018, and the appendix in Goffman 2014 for notable examples). Scholars who conduct interviews also can increase the transparency of their work by including appendices. Shesterinina (2016) provided a particularly impressive example, describing in detail how she organized her fieldwork in Abkhazia, her interview settings and strategies, and how she recruited respondents and gained their trust. Bleich and Pekkanen (2013) proposed a formalized “Interview Methods Appendix” comprising a descriptive list of interviews conducted (including, e.g., the source of the interview contact, structure, and length); Bleich (2018) included such an appendix for a recent article.

Scholars who conduct archival research can augment their work’s transparency by providing a log that describes how cited primary or secondary historical sources were originally produced and why, among those consulted, a subset was selected for inclusion in the research (Gaikwad, Herrera, and Mickey 2019, 2; Verghese 2016, appendix 3, discusses the use and selection of secondary historical sources). Scholars whose work relies heavily on qualitative coding can enhance transparency by including with their publication a “coding appendix” that details how they arrived at their initial coding, resolved intercoder disagreements, and refined their schema. For instance, Fuji Johnson (2017) included such an appendix to describe how she coded legislative discourse on sex work in Canada. Scholars who use process tracing can generate appendices to bolster analytic claims and make their role in an overall argument more explicit (Bennett, Fairfield, and Soifer 2019, 9–10; Fairfield 2013).

Other forms of research documentation also can be shared to increase transparency. For instance, scholars in education research have pioneered the use of reflective journals in which they record in detail the steps of the research process (Ortlipp 2008). Also, some scholars with large research teams periodically conduct “debriefing interviews” in which team members describe their decisions and actions throughout the research process (see, e.g., Collins et al. 2013, which discusses the use of this practice in Onwuegbuzie et al. 2011). Sharing these journals and interview transcripts as part of a methodological appendix illuminates and clarifies key research steps and choices for readers.

Most generally, scholars can include as a methodological appendix the information that they assembled following “reporting guidelines” that set thresholds for information provision about data collection and analysis. Most commonly used in medical research, these guidelines exist for in-depth interviews and focus groups (COREQ: Tong, Sainsbury, and Craig 2007); for synthesizing qualitative research (ENTREQ: Tong et al. 2012); and for qualitative research in general (SRQR: O’Brien et al. 2014). Although they need to be adapted for use by different political science research communities, these guidelines strike us as a potentially fruitful way to consider, organize, and systematize ideas about what should be shared to achieve transparency.

The potential utility of methodological appendices notwithstanding, they can be hard to locate and their discussions difficult to connect to particular arguments in the text (Grossman and Pedahzur 2020, 2f.). Especially when placed on an author’s personal website, online appendices also are at risk of eventually becoming unavailable (Gertler and Bullock 2017, 167). As an alternative, scholars can publish methodological companion articles to augment the transparency of a primary publication. Doing so can lower the opportunity cost of pursuing transparency, allowing scholars to enhance both the transparency of their work and their publishing record.Footnote 11

Annotation

Annotation also can help scholars to achieve transparency. Two forms of annotation developed for political science inquiry are Active Citation (AC), pioneered by Moravcsik (2014; 2019), and Annotation for Transparent Inquiry (ATI; see qdr.org/ati) (Gaikwad, Herrera, and Mickey 2019, 15–17), developed by the Qualitative Data Repository (with which both authors are affiliated). ATI builds on AC, using more sophisticated technology and placing greater emphasis on the value of sharing underlying data sources. ATI uses open-annotation technology to allow researchers to link specific passages in a publication to digital annotations comprising “analytic notes” and extended excerpts from data sources, as well as to the data sources themselves when they can be shared ethically and legally (Karcher and Weber 2019). Analytic notes can elucidate data generation or analysis, make explicit the link between a source and a claim in a published text, or discuss other aspects of the research process. Extended excerpts facilitate transparency even when sharing underlying data sources is not possible (Ellett 2016; and see the discussion of this project in Shesterinina, Pollack, and Arriola 2019, 23). These annotations, as well as an overview of the research process, comprise an ATI data supplement. By associating methodological discussion and underlying data with the precise point in the text to which they relate, ATI enhances scholars’ ability to demonstrate the rigor of their work without disturbing narrative flow. Given the relative novelty of the approach, however, researchers may find the creation of ATI supplements time-consuming.
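
To make this more concrete, the sketch below shows what a single annotation might look like if expressed in the W3C Web Annotation data model, an open standard for anchoring notes to exact passages in web documents. The sketch is purely illustrative: the DOI, quoted passage, and note text are hypothetical placeholders, and the format of an actual ATI data supplement may differ.

import json

# Illustrative only: a single annotation in the W3C Web Annotation model.
# The "target" anchors the note to an exact passage in the publication via a
# TextQuoteSelector; the "body" carries the analytic note. All URLs, quoted
# text, and note content below are hypothetical placeholders.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "body": {
        "type": "TextualBody",
        "purpose": "commenting",
        "value": (
            "Analytic note: this claim rests on interviews 4 and 7; "
            "an extended excerpt is included in the data supplement."
        ),
    },
    "target": {
        "source": "https://doi.org/10.xxxx/example-article",  # hypothetical DOI
        "selector": {
            "type": "TextQuoteSelector",
            "exact": "brokers conditioned turnout on access to benefits",
            "prefix": "Our interviews suggest that ",
            "suffix": " in both provinces.",
        },
    },
}

print(json.dumps(annotation, indent=2))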

QDA Software Output

Bringer, Johnston, and Brackenridge (2004) highlighted how scholars can use qualitative data analysis (QDA) software—in particular, the memo/note function—to provide an “electronic audit trail” of the research process and the development of a project.Footnote 12 Similarly, based on their work in international business and management, Sinkovics and Alfoldi (2012) argued that QDA software can help to capture the nonlinear back-and-forth between data collection and analysis that is characteristic of much qualitative work, thereby improving its transparency and trustworthiness. QDA software also can help scholars to provide a coherent image of their data as a whole. Corti and Gregory (2011) have long advocated for the sharing of QDA-generated qualitative data, and some researchers have shared excerpts from their QDA projects (Luigjes 2019; O’Neill 2017). Although the proprietary nature of QDA file formats has long stymied these efforts, the recent emergence of an open-exchange format for QDA data—supported by the major software projects (see www.qdasoftware.org)—should help scholars to be transparent about the generation and analysis of data and to share the QDA data themselves. Sharing QDA-generated qualitative data has low opportunity costs because doing so does not entail the creation of a separate product (as with, e.g., a methodological appendix). It may be difficult, however, to disentangle shareable data from those that cannot be shared (for legal or ethical reasons) because such data typically are exported in their entirety.
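
To illustrate what such an open-exchange format enables, the short Python sketch below lists the codes contained in a hypothetical REFI-QDA project export. It is our own illustration, not part of any QDA package: it assumes the export is a zipped archive containing an XML project file (ending in .qde) whose codebook consists of Code elements with name attributes, and a given package’s export may deviate from these assumptions.

import zipfile
import xml.etree.ElementTree as ET

def list_codes(qdpx_path):
    """Return the code names found in a REFI-QDA (.qdpx) project export.

    Assumptions: the export is a zip archive containing an XML project file
    (conventionally ending in .qde) whose codebook is made up of Code elements
    with a 'name' attribute. XML namespaces are deliberately ignored.
    """
    with zipfile.ZipFile(qdpx_path) as archive:
        project_file = next(
            name for name in archive.namelist() if name.lower().endswith(".qde")
        )
        root = ET.fromstring(archive.read(project_file))
    return [
        element.get("name")
        for element in root.iter()
        if element.tag.split("}")[-1] == "Code" and element.get("name")
    ]

# Hypothetical usage:
# print(list_codes("interview_project.qdpx"))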

Data Sharing

A final strategy that scholars can adopt to increase the transparency of their work is making the underlying data accessible to other researchers. This strategy intersects with some of those mentioned previously. For instance, some scholars include data as part of their methodological appendices or ATI annotations.

Achieving transparency does not require that scholars share all of the data that underpin a publication but rather calls on them to make careful choices about which data to share. For instance, ethical and legal constraints may limit which data can be shared. As noted previously, if human participants in a scholar’s research do not consent to the information they provide being shared more broadly, it cannot be shared. Likewise, it may not be possible to share documents that are under copyright. However, scholars can petition the copyright owner for permission to share such material (see, e.g., newspaper articles shared in association with Holland 2019), and documents that are in the public domain can be freely shared. For instance, Hitt (2019) shared papers of US Supreme Court justices that they had dedicated to the public domain.

Sharing data ethically, and ensuring that they are useful to others, may require scholars to take preparatory steps including cleaning, organizing, and documenting the data. The earlier in the research process that scholars take these steps, the less time-consuming they may be. Also, to protect human participants, scholars may need to de-identify data—that is, remove “direct identifiers” (i.e., pieces of information that are sufficient, on their own, to disclose an identity, such as proper names, addresses, and telephone numbers) and “indirect identifiers” (i.e., contextual information that can be used—often in combination with other information—to identify a participant).Footnote 13 Contreras (2019, 11–16) explored three strategies for “partially disclosing” information about participants in dangerous research: semibiographical disclosure, partial spatial disclosure, and invitational disclosure (which involves inviting people to a field site to meet participants); see also Shesterinina, Pollack, and Arriola (2019, 15–16).
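
Although de-identification ultimately requires human judgment (especially for indirect identifiers, which rarely follow fixed patterns), parts of the task can be scripted. The Python sketch below is a minimal, hypothetical illustration of pattern-based redaction of some direct identifiers in transcript text; the patterns and names are placeholders, and the sketch is not a substitute for careful manual review.

import re

# Illustrative only: regex-based redaction catches some direct identifiers
# (emails and phone numbers) and replaces a researcher-supplied list of proper
# names with pseudonyms. Indirect identifiers still require human review.
DIRECT_PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "[PHONE]": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def deidentify(text, name_map):
    """Redact pattern-based identifiers and replace known names with pseudonyms."""
    for placeholder, pattern in DIRECT_PATTERNS.items():
        text = pattern.sub(placeholder, text)
    for name, pseudonym in name_map.items():
        text = re.sub(re.escape(name), pseudonym, text, flags=re.IGNORECASE)
    return text

# Hypothetical usage:
# transcript = "Contact Maria Lopez at maria@example.org or 555-123-4567."
# print(deidentify(transcript, {"Maria Lopez": "Respondent 12"}))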

Scholars can make their research data available in many venues. Best practice is to do so in institutions such as data repositories (Kapiszewski and Karcher 2020). Scholars who share data in these venues can help to address ethical concerns about the data’s availability by placing “access controls” on the data that limit the number or type of individuals to whom they are available. Scholars also can combine strategies. For instance, Camp and Dunning (2015) shared de-identified transcripts of interviews with political brokers in Argentina, describing the general region but not the specific location where the data were collected, and restricted access to the data to researchers with clearly specified research plans.

CONCLUSION: CONTINUING FORWARD

Debates about the challenges and benefits of research transparency, about the “content” of transparency, and about how precisely to achieve transparency in scholarly work, are proceeding across academic disciplines and geographies. Multiple innovative techniques have been developed to aid scholars to increase the transparency of their work within ethical and legal limits, and to help them balance the considerations that bear on the pursuit of transparency. The creation and use of these techniques highlight that “transparency” is not an all-or-nothing prospect: most work is neither left completely opaque nor made completely transparent but rather falls somewhere in between.

Indeed, it is important to remember that transparency is a means to an end, not an end in itself. As discussed here, transparency adds value by facilitating comprehension and assessment of our scholarship. The goal and necessity of assessment are in no way new: our research is assessed informally every day by individual scholars and, periodically, more formally through the peer-review process. However, increasing transparency facilitates new ways to evaluate qualitative inquiry. The availability of shared data and materials also raises compelling questions. Can (and should) we use data and materials shared to augment the transparency of qualitative work to verify claims made in that work? If so, how can we develop forms of evaluation that accommodate the diverse epistemological commitments and methodological practices that make qualitative research such a rich and powerful form of inquiry? How can and should shared qualitative data be valued compared to traditional scholarly outputs (e.g., published articles)? Can shared qualitative data and materials be used in qualitative-methods instruction in ways similar to those in which their quantitative analogues are routinely used in quantitative-methods courses?

It is critically important that scholars who use qualitative data and methods continue to discuss all of these topics, to engage with one another within and across different qualitative research traditions, and to listen to and learn from one another. Broad ongoing involvement is crucial to the productivity of the conversation. Ultimately, however, we believe that the large and heterogeneous community of qualitative researchers will develop the best answers to the questions raised in this article and the broader symposium by actively seeking to make their work more transparent, employing the techniques discussed here and others that emerge. As they do so, research communities can draw on their examples to develop community-specific norms and practices for transparency, which funders, journal editors, and other institutions then can adopt. Both continued conversation and engaged practice are necessary for transparency to be deployed to its best purpose: to demonstrate the rigor of qualitative research and its valuable contributions to the production of knowledge.

ACKNOWLEDGMENTS

We are extremely grateful to the editors of this symposium for inviting us to participate and to this article’s three reviewers—Alan Jacobs, Tamarinde Haven (in signed reviews), and an anonymous reviewer—for their suggestions and recommendations, which made the article much stronger. Any remaining problems are our responsibility alone. This article is based upon work supported by the National Science Foundation under Grant No. 1823950.

Footnotes

1. This formulation follows the American Political Science Association’s (APSA) conceptualization of transparency (APSA 2012).

2. These reports were published on the Social Science Research Network (SSRN) in early 2019. A summary of the process and its findings authored by Büthe and Jacobs is forthcoming in Perspectives on Politics.

3. Our deep appreciation for the heterogeneity of qualitative political science notwithstanding, much of this article discusses “qualitative research” in general rather than considering different types of qualitative work individually. Our doing so is simply a function of space constraints.

4. The conversation among political scientists who work with quantitative data began much earlier; see Janz (2018) for an overview of the recent debate in the discipline. There also are earlier debates in other disciplines about sharing qualitative data, most notably among UK-based sociologists (Bishop 2005; Parry and Mauthner 2004; 2005).

5. Over time, we have compiled a large collection of the literature addressing transparency in qualitative research in a Zotero library that we have made public at www.zotero.org/groups/2379934/items. We invite readers to consult the collection.

6. These include symposia in two newsletters of the Qualitative and Multi-Method Research (QMMR) section of APSA (2012 and 2015), and in PS: Political Science & Politics (2014), Security Studies (2014), the APSA International History and Politics section newsletter (2016), and the APSA Comparative Politics section newsletter (2016).

7. See also Bishop (2009), who advocated for extending our consideration of the ethics of sharing research data to include the effect that sharing may have on actors other than human participants (e.g., research communities and the public).

8. Simonsohn, Nelson, and Simmons (2014) and Nosek et al. (2018) discuss preregistration in quantitative work.

9. See also Bellin et al. (2019, 7) and Shesterinina, Pollack, and Arriola (2019, 20–21).

10. In an extreme example, one of several appendices to Treisman (2020) is more than 2,000 pages long.

11. For instance, Jaramillo et al. (2017) discussed the coding procedures in their research on ex-combatants in Colombia (Steiner et al. 2017), and Deterding and Waters (2018) offered case memos and described the analytic codes and model building used across multiple publications in the Resilience in Survivors of Katrina (RISK) project (see www.riskproject.org).

12. QDA software allows the storage, coding, and annotation of sources typically used in qualitative work; two common packages are NVivo and Atlas.ti.

13. See also Stein (2010), who considered whether anonymity in ethnographic research necessarily serves scholars and human participants.

REFERENCES

American Political Science Association. 2012. A Guide to Professional Ethics in Political Science. 2nd ed. Washington, DC: American Political Science Association.
Barnes, Jeb, and Weller, Nicholas. 2017. “Case Studies and Analytic Transparency in Causal-Oriented Mixed-Methods Research.” PS: Political Science & Politics 50 (4): 1019–22. doi:10.1017/S1049096517001202.
Bellin, Eva, Greitens, Sheena Chestnut, Herrera, Yoshiko M., and Singerman, Diane. 2019. “Research in Authoritarian and Repressive Contexts.” In American Political Science Association Organized Section for Qualitative and Multi-Method Research, Qualitative Transparency Deliberations, Working Group Final Reports, Report IV.1 (2018). doi:10.2139/ssrn.3333496.
Bennett, Andrew, Fairfield, Tasha, and Soifer, Hillel David. 2019. “Comparative Methods and Process Tracing.” In American Political Science Association Organized Section for Qualitative and Multi-Method Research, Qualitative Transparency Deliberations, Working Group Final Reports, Report III.1 (January). doi:10.2139/ssrn.3333405.
Bishop, Libby. 2005. “Protecting Respondents and Enabling Data Sharing: Reply to Parry and Mauthner.” Sociology 39 (2): 333–36. doi:10.1177/0038038505050542.
Bishop, Libby. 2009. “Ethical Sharing and Reuse of Qualitative Data.” Australian Journal of Social Issues 44 (3): 255–72. doi:10.1002/j.1839-4655.2009.tb00145.x.
Bleich, Erik. 2018. “Historical Institutionalism and Judicial Decision-Making: Ideas, Institutions, and Actors in French High Court Hate Speech Rulings.” World Politics 70 (1): 53–85.
Bleich, Erik, and Pekkanen, Robert. 2013. “How to Report Interview Data.” In Interview Research in Political Science, ed. Mosley, Layna, 84–106. Ithaca, NY: Cornell University Press.
Bringer, Joy D., Johnston, Lynne H., and Brackenridge, Celia H. 2004. “Maximizing Transparency in a Doctoral Thesis: The Complexities of Writing About the Use of QSR*NVIVO Within a Grounded Theory Study.” Qualitative Research 4 (2): 247–65. doi:10.1177/1468794104044434.
Büthe, Tim, and Jacobs, Alan. 2015. “Introduction to the Symposium.” Qualitative & Multi-Method Research 13 (1): 2–8. doi:10.5281/zenodo.892931.
Camp, Edwin, and Dunning, Thad. 2015. “Brokers, Voters, and Clientelism: The Puzzle of Distributive Politics.” Qualitative Data Repository. doi:10.5064/F6Z60KZB.
Carusi, Annamaria, and Jirotka, Marina. 2009. “From Data Archive to Ethical Labyrinth.” Qualitative Research 9 (3): 285–98. doi:10.1177/1468794109105032.
Chauvette, Amelia, Schick-Makaroff, Kara, and Molzahn, Anita E. 2019. “Open Data in Qualitative Research.” International Journal of Qualitative Methods 18 (January): 1–6. doi:10.1177/1609406918823863.
Christensen, Darin, Hartman, Alexandra, and Samii, Cyrus. 2019. “Legibility and External Investment: An Institutional Natural Experiment in Liberia.” Unpublished manuscript. https://darinchristensen.com/files/liberia-tenure.pdf.
Collins, Kathleen M. T., Onwuegbuzie, Anthony J., Johnson, R. Burke, and Frels, Rebecca K. 2013. “Practice Note: Using Debriefing Interviews to Promote Authenticity and Transparency in Mixed Research.” International Journal of Multiple Research Approaches 7 (2): 271–84. doi:10.5172/mra.2013.7.2.271.
Contreras, Randol. 2019. “Transparency and Unmasking Issues in Ethnographic Crime Research: Methodological Considerations.” Sociological Forum 34 (2): 293–312. doi:10.1111/socf.12498.
Corti, Louise. 2006. “Qualitative Archiving and Data Sharing: Extending the Reach and Impact of Qualitative Data.” IASSIST Quarterly 29 (3): 8. doi:10.29173/iq370.
Corti, Louise, and Gregory, Arofan. 2011. “CAQDAS Comparability. What about CAQDAS Data Exchange?” Forum: Qualitative Sozialforschung/Forum: Qualitative Social Research 12 (1). doi:10.17169/fqs-12.1.1634.
Deterding, Nicole M., and Waters, Mary C. 2018. “Flexible Coding of In-Depth Interviews: A Twenty-First-Century Approach.” Sociological Methods & Research. doi:10.1177/0049124118799377.
Elkins, Zachary, Spitzer, Scott, and Tallberg, Jonas. 2019. “Content Analysis, Non-Automated.” SSRN Electronic Journal. doi:10.2139/ssrn.3333485.
Ellett, Rachel. 2016. “Data for: ‘Democratic and Judicial Stagnation,’ in Pathways to Judicial Power in Transitional States: Perspectives from African Courts.” Qualitative Data Repository. doi:10.5064/f6pn93h4.
Elman, Colin, Kapiszewski, Diana, and Lupia, Arthur. 2018. “Transparent Social Inquiry: Implications for Political Science.” Annual Review of Political Science 21 (1): 29–47. doi:10.1146/annurev-polisci-091515-025429.
Elman, Colin, and Lupia, Arthur. 2016. “DA-RT: Aspirations and Anxieties.” Comparative Politics Newsletter 26 (1): 44–52. http://comparativenewsletter.com/files/archived_newsletters/newsletter_spring2016.pdf.
Fairfield, Tasha. 2013. “Going Where the Money Is: Strategies for Taxing Economic Elites in Unequal Democracies.” World Development 47 (July): 42–57. doi:10.1016/j.worlddev.2013.02.011.
Fuji Johnson, Genevieve. 2017. “Data for: A Question of Respect: A Qualitative Text Analysis of Canadian Parliamentary Committee Hearings on PCEPA.” Qualitative Data Repository. doi:10.5064/f6z31wj1.
Fujii, Lee Ann. 2016. “The Dark Side of DA-RT.” Comparative Politics Newsletter 26 (1): 25–27. http://comparativenewsletter.com/files/archived_newsletters/newsletter_spring2016.pdf.
Gaikwad, Nikhar, Herrera, Veronica, and Mickey, Robert. 2019. “Text-Based Sources.” SSRN Electronic Journal. doi:10.2139/ssrn.3332891.
Gertler, Aaron L., and Bullock, John G. 2017. “Reference Rot: An Emerging Threat to Transparency in Political Science.” PS: Political Science & Politics 50 (1): 166–71. doi:10.1017/S1049096516002353.
Gleditsch, Nils Petter, and Janz, Nicole. 2016. “Replication in International Relations.” International Studies Perspectives 17 (4): 361–66. doi:10.1093/isp/ekv003.
Goffman, Alice. 2014. On the Run: Fugitive Life in an American City. Chicago and London: University of Chicago Press.
Golder, Matt, and Golder, Sona N. 2016. “Letter from the Editors.” Comparative Politics Newsletter 26 (1): 1–10. http://comparativenewsletter.com/files/archived_newsletters/newsletter_spring2016.pdf.
Grossman, Jonathan, and Pedahzur, Ami. 2020. “Can We Do Better? Replication and Online Appendices in Political Science.” Perspectives on Politics. doi:10.1017/S1537592720001206.
Haven, Tamarinde L., and Van Grootel, Leonie. 2019. “Preregistering Qualitative Research.” Accountability in Research 26 (3): 229–44. doi:10.1080/08989621.2019.1580147.
Hitt, Matthew. 2019. “Replication Data for: Inconsistency and Indecision in the United States Supreme Court.” Qualitative Data Repository. doi:10.5064/F6W7QRSX.
Holland, Alisha. 2019. “Data for: Forbearance as Redistribution: The Politics of Informal Welfare in Latin America.” Qualitative Data Repository. doi:10.5064/f626jgpb.
Jacobs, Alan M. 2020. “Pre-Registration and Results-Free Review in Observational and Qualitative Research.” In The Production of Knowledge: Enhancing Progress in Social Science, ed. Elman, Colin, Gerring, John, and Mahoney, James, 221–64. Cambridge and New York: Cambridge University Press. doi:10.1017/9781108762519.009.
Jacobs, Alan M., Büthe, Tim, Arjona, Ana M., Arriola, Leonardo R., Bellin, Eva, Bennett, Andrew, Björkman, Lisa, et al. 2019. “Transparency in Qualitative Research: An Overview of Key Findings and Implications of the Deliberations.” Introduction to American Political Science Association Organized Section for Qualitative and Multi-Method Research, Qualitative Transparency Deliberations, Working Group Final Reports (July 2019). doi:10.2139/ssrn.3430025.
Janz, Nicole. 2018. “Replication and Transparency in Political Science: Did We Make Any Progress?” Political Science Replication, July 14. https://politicalsciencereplication.wordpress.com/2018/07/14/replication-and-transparency-in-political-science-did-we-make-any-progress/amp/?__twitter_impression=true.
Jaramillo, Maria C., Maia, Rousiley C. M., Mameli, Simona, and Steiner, Jürg. 2017. “For More Transparency in Deliberative Research: Implications for Deliberative Praxis.” Journal of Public Deliberation 13 (2). doi:10.16997/jdd.288.
Kapiszewski, Diana, and Karcher, Sebastian. 2020. “Making Research Data Accessible.” In The Production of Knowledge: Enhancing Progress in Social Science, ed. Elman, Colin, Gerring, John, and Mahoney, James, 197–220. Cambridge and New York: Cambridge University Press. doi:10.1017/9781108762519.008.
Karcher, Sebastian, and Weber, Nicholas. 2019. “Annotation for Transparent Inquiry: Transparent Data and Analysis for Qualitative Research.” IASSIST Quarterly 43 (2): 1–9. doi:10.29173/iq959.
Kern, Florian, and Gleditsch, Kristian Skrede. 2017. “Exploring Pre-Registration and Pre-Analysis Plans for Qualitative Inference.” Unpublished manuscript. doi:10.13140/RG.2.2.14428.69769.
Lester, Jessica Nina, and Anders, Allison Daniel. 2018. “Engaging Ethics in Postcritical Ethnography: Troubling Transparency, Trustworthiness, and Advocacy.” Forum Qualitative Sozialforschung 19 (3). doi:10.17169/fqs-19.3.3060.
Luigjes, Christiaan. 2019. “Institutional Moral Hazard in the Regulation of Unemployment.” Qualitative Data Repository. doi:10.5064/f65bvecy.
Miguel, Edward, Camerer, Colin, Casey, Katherine, Cohen, Joshua, Esterling, Kevin Michael, Gerber, Alan, Glennerster, Rachel, et al. 2014. “Promoting Transparency in Social Science Research.” Science 343 (6166): 30–31. doi:10.1126/science.1245317.
Monroe, Kristen Renwick. 2018. “The Rush to Transparency: DA-RT and the Potential Dangers for Qualitative Research.” Perspectives on Politics 16 (1): 141–48. doi:10.1017/S153759271700336X.
Moravcsik, Andrew. 2012. “Active Citation and Qualitative Political Science.” Qualitative & Multi-Method Research 10 (1): 33–37. doi:10.5281/zenodo.917652.
Moravcsik, Andrew. 2014. “Trust, but Verify: The Transparency Revolution and Qualitative International Relations.” Security Studies 23 (4): 663–88. doi:10.1080/09636412.2014.970846.
Moravcsik, Andrew. 2019. “Transparency in Qualitative Research.” Sage Research Methods Foundations. doi:10.4135/9781526421036863782.
Nosek, Brian A., Ebersole, Charles R., DeHaven, Alexander C., and Mellor, David T. 2018. “The Preregistration Revolution.” Proceedings of the National Academy of Sciences 115 (11): 2600–606. doi:10.1073/pnas.1708274114.
O’Brien, Bridget, Harris, Ilene, Beckman, Thomas, Reed, Darcy, and Cook, David. 2014. “Standards for Reporting Qualitative Research: A Synthesis of Recommendations.” Academic Medicine 89 (9): 1245–51. doi:10.1097/ACM.0000000000000388.
O’Neill, Maureen. 2017. “High Performance School-Age Athletes at Australian Schools: A Study of Conflicting Demands.” Qualitative Data Repository. doi:10.5064/f6zp448b.
Onwuegbuzie, Anthony J., Frels, Rebecca K., Leech, Nancy L., and Collins, Kathleen M. T. 2011. “A Mixed Research Study of Pedagogical Approaches and Student Learning in Doctoral-Level Mixed Research Courses.” International Journal of Multiple Research Approaches 5 (2): 169–99. doi:10.5172/mra.2011.5.2.169.
Ortlipp, Michelle. 2008. “Keeping and Using Reflective Journals in the Qualitative Research Process.” The Qualitative Report 13 (4): 695–705. https://nsuworks.nova.edu/tqr/vol13/iss4/8.
Parry, Odette, and Mauthner, Natasha S. 2004. “Whose Data Are They Anyway? Practical, Legal and Ethical Issues in Archiving Qualitative Research Data.” Sociology 38 (1): 139–52. doi:10.1177/0038038504039366.
Parry, Odette, and Mauthner, Natasha S. 2005. “Back to Basics: Who Re-Uses Qualitative Data and Why?” Sociology 39 (2): 337–42. doi:10.1177/0038038505050543.
Piñeiro, Rafael, and Rosenblatt, Fernando. 2016. “Pre-Analysis Plans for Qualitative Research.” Revista de Ciencia Política 36 (3): 785–96. doi:10.4067/S0718-090X2016000300009.
Reyes, Victoria. 2018. “Three Models of Transparency in Ethnographic Research: Naming Places, Naming People, and Sharing Data.” Ethnography 19 (2): 204–26. doi:10.1177/1466138117733754.
Saunders, Elizabeth N. 2014. “Transparency without Tears: A Pragmatic Approach to Transparent Security Studies Research.” Security Studies 23 (4): 689–98. doi:10.1080/09636412.2014.970405.
Schneider, Carsten, Vis, Barbara, and Koivu, Kendra. 2019. “Set-Analytic Approaches, Especially Qualitative Comparative Analysis (QCA).” American Political Science Association Organized Section for Qualitative and Multi-Method Research, Qualitative Transparency Deliberations, Working Group Final Reports, Report III.4 (January 2019). doi:10.2139/ssrn.3333474.
Schwartz-Shea, Peregrine, and Yanow, Dvora. 2016. “Legitimizing Political Science or Splitting the Discipline? Reflections on DA-RT and the Policy-Making Role of a Professional Association.” Politics and Gender 12 (3): e11. doi:10.1017/S1743923X16000428.
Shesterinina, Anastasia. 2016. “Collective Threat Framing and Mobilization in Civil War.” American Political Science Review 110 (3): 411–27. doi:10.1017/S0003055416000277.
Shesterinina, Anastasia, Pollack, Mark A., and Arriola, Leonardo R. 2019. “Evidence from Researcher Interactions with Human Participants.” American Political Science Association Organized Section for Qualitative and Multi-Method Research, Qualitative Transparency Deliberations, Working Group Final Reports, Report II.2 (December 2018). doi:10.2139/ssrn.3333392.
Simonsohn, Uri, Nelson, Leif D., and Simmons, Joseph P. 2014. “P-Curve: A Key to the File-Drawer.” Journal of Experimental Psychology: General 143 (2): 534–47. doi:10.1037/a0033242.
Sinkovics, Rudolf R., and Alfoldi, Eva A. 2012. “Progressive Focusing and Trustworthiness in Qualitative Research: The Enabling Role of Computer-Assisted Qualitative Data Analysis Software (CAQDAS).” Management International Review 52 (6): 817–45. doi:10.1007/s11575-012-0140-5.
Stein, Arlene. 2010. “Sex, Truths, and Audiotape: Anonymity and the Ethics of Exposure in Public Ethnography.” Journal of Contemporary Ethnography 39 (5): 554–68. doi:10.1177/0891241610375955.
Steiner, Jürg, Jaramillo, Maria C., Maia, Rousiley C. M., and Mameli, Simona. 2017. Deliberation across Deeply Divided Societies: Transformative Moments. Cambridge and New York: Cambridge University Press.
Tong, Allison, Flemming, Kate, McInnes, Elizabeth, Oliver, Sandy, and Craig, Jonathan. 2012. “Enhancing Transparency in Reporting the Synthesis of Qualitative Research: ENTREQ.” BMC Medical Research Methodology 12 (1): 181. doi:10.1186/1471-2288-12-181.
Tong, Allison, Sainsbury, Peter, and Craig, Jonathan. 2007. “Consolidated Criteria for Reporting Qualitative Research (COREQ): A 32-Item Checklist for Interviews and Focus Groups.” International Journal for Quality in Health Care 19 (6): 349–57. doi:10.1093/intqhc/mzm042.
Treisman, Daniel. 2020. “Democracy by Mistake: How the Errors of Autocrats Trigger Transitions to Freer Government.” American Political Science Review 114 (3): 792–810.
Tripp, Aili Mari. 2018. “Transparency and Integrity in Conducting Field Research on Politics in Challenging Contexts.” Perspectives on Politics 16 (3): 728–38. doi:10.1017/S1537592718001056.
Tsai, Alexander C., Kohrt, Brandon A., Matthews, Lynn T., Betancourt, Theresa S., Lee, Jooyoung K., Papachristos, Andrew V., Weiser, Sheri D., and Dworkin, Shari L. 2016. “Promises and Pitfalls of Data Sharing in Qualitative Research.” Social Science & Medicine 169 (November): 191–98. doi:10.1016/j.socscimed.2016.08.004.
Tuval-Mashiach, Rivka. 2017. “Raising the Curtain: The Importance of Transparency in Qualitative Research.” Qualitative Psychology 4 (2): 126–38. doi:10.1037/qup0000062.
Verghese, Ajay. 2016. The Colonial Origins of Ethnic Violence in India. Studies of the Walter H. Shorenstein Asia–Pacific Research Center. Redwood City, CA: Stanford University Press.