
Explaining Causal Findings Without Bias: Detecting and Assessing Direct Effects

Published online by Cambridge University Press:  13 September 2016

AVIDIT ACHARYA, Stanford University
MATTHEW BLACKWELL, Harvard University
MAYA SEN, Harvard University

Avidit Acharya is Assistant Professor of Political Science, Stanford University ([email protected]).
Matthew Blackwell is Assistant Professor of Government, Harvard University ([email protected]).
Maya Sen is Assistant Professor of Public Policy, Harvard University ([email protected]).

Abstract

Researchers seeking to establish causal relationships frequently control for variables on the purported causal pathway, checking whether the original treatment effect then disappears. Unfortunately, this common approach may lead to biased estimates. In this article, we show that the bias can be avoided by focusing on a quantity of interest called the controlled direct effect. Under certain conditions, the controlled direct effect enables researchers to rule out competing explanations—an important objective for political scientists. To estimate the controlled direct effect without bias, we describe an easy-to-implement estimation strategy from the biostatistics literature. We extend this approach by deriving a consistent variance estimator and demonstrating how to conduct a sensitivity analysis. Two examples—one on ethnic fractionalization’s effect on civil war and one on the impact of historical plough use on contemporary female political participation—illustrate the framework and methodology.
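The "easy-to-implement estimation strategy" the abstract refers to is sequential g-estimation, a two-stage procedure from the biostatistics literature (see Vansteelandt 2009 and Joffe and Greene 2009 in the references). The sketch below is a minimal illustration of that idea on simulated data, not the authors' DirectEffects implementation: every variable name and coefficient is hypothetical, and the naive second-stage standard errors would understate uncertainty because they ignore the first stage, which is what the article's consistent variance estimator addresses.

```python
# A minimal sketch of sequential g-estimation for the average controlled direct
# effect (ACDE), using simulated data. All variable names and coefficients are
# hypothetical; this is not the authors' DirectEffects implementation.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

x = rng.normal(size=n)                       # pretreatment covariate
a = rng.binomial(1, 0.5, size=n)             # treatment
z = 0.5 * a + rng.normal(size=n)             # intermediate confounder (affected by treatment)
m = 0.7 * a + 0.6 * z + rng.normal(size=n)   # mediator
y = 1.0 * a + 0.8 * m + 0.5 * z + 0.3 * x + rng.normal(size=n)  # outcome

# With the mediator fixed at m = 0, the path a -> z -> y stays open, so the true
# ACDE(0) in this simulation is 1.0 + 0.5 * 0.5 = 1.25.

def ols(outcome, regressors):
    """Least-squares coefficients of outcome on an intercept plus regressors."""
    design = np.column_stack([np.ones(len(outcome)), regressors])
    coefs, *_ = np.linalg.lstsq(design, outcome, rcond=None)
    return coefs

# Stage 1: regress the outcome on the mediator, treatment, and all confounders,
# then remove ("demediate") the mediator's estimated contribution.
stage1 = ols(y, np.column_stack([m, a, z, x]))
y_demediated = y - stage1[1] * m             # stage1[1] is the mediator coefficient

# Stage 2: regress the demediated outcome on treatment and pretreatment covariates
# only. The treatment coefficient estimates ACDE(m = 0). Its naive standard error
# ignores stage-1 uncertainty; the article derives a consistent variance estimator.
stage2 = ols(y_demediated, np.column_stack([a, x]))
print("Estimated ACDE(m = 0):", round(stage2[1], 3))  # close to 1.25 in large samples
```

Note that the intermediate confounder z enters only the first stage; conditioning on it in the second stage would be exactly the kind of adjustment for a post-treatment variable that the article argues can bias the estimated effect.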

Type: Research Article
Copyright: © American Political Science Association 2016


Footnotes

Thanks to Adam Cohon, Allan Dafoe, Justin Esarey, Adam Glynn, Robin Harding, Gary King, Macartan Humphreys, Kosuke Imai, Bethany Lacina, Jacob Montgomery, Judea Pearl, Dustin Tingley, Teppei Yamamoto, and conference or workshop participants at Dartmouth, Harvard, Princeton, WashU, the Midwest Political Science Association meeting, and the Society for Political Methodology summer meeting for helpful discussions and comments. Thanks to Anton Strezhnev for valuable research assistance. Any remaining errors are our own. The methods in this article are available as an open-source R package, DirectEffects, at http://www.mattblackwell.org/software/direct-effects/. Code and data to replicate results in this article can be found at http://dx.doi.org/10.7910/DVN/VNXEM6.

References


Acharya, Avidit, Blackwell, Matthew, and Sen, Maya. N.d. “The Political Legacy of American Slavery.” Journal of Politics. Forthcoming. http://www.journals.uchicago.edu/doi/abs/10.1086/686631
Alesina, Alberto, Giuliano, Paola, and Nunn, Nathan. 2013. “On the Origins of Gender Roles: Women and the Plough.” Quarterly Journal of Economics 128 (2): 469–530.
Angrist, Joshua D., and Pischke, Jörn-Steffen. 2008. Mostly Harmless Econometrics: An Empiricist’s Companion. Princeton: Princeton University Press.
Banerjee, Abhijit, and Iyer, Lakshmi. 2005. “History, Institutions, and Economic Performance: The Legacy of Colonial Land Tenure Systems in India.” American Economic Review 95 (4): 1190–213.
Blackwell, Matthew. 2013. “A Framework for Dynamic Causal Inference in Political Science.” American Journal of Political Science 57 (2): 504–20. http://www.mattblackwell.org/files/papers/dynci.pdf
Blackwell, Matthew. 2014. “A Selection Bias Approach to Sensitivity Analysis for Causal Effects.” Political Analysis 22 (2): 169–82.
Dell, Melissa. 2010. “The Persistent Effects of Peru’s Mining Mita.” Econometrica 78 (6): 1863–903.
Fearon, James D., and Laitin, David D. 2003. “Ethnicity, Insurgency, and Civil War.” American Political Science Review 97 (1): 75–90.
Holland, Paul W. 1986. “Statistics and Causal Inference.” Journal of the American Statistical Association 81 (396): 945–60. http://www.jstor.org/stable/2289064
Imai, Kosuke, Keele, Luke, Tingley, Dustin, and Yamamoto, Teppei. 2011. “Unpacking the Black Box of Causality: Learning about Causal Mechanisms from Experimental and Observational Studies.” American Political Science Review 105 (4): 765–89.
Imai, Kosuke, Keele, Luke, and Yamamoto, Teppei. 2010. “Identification, Inference and Sensitivity Analysis for Causal Mediation Effects.” Statistical Science 25 (1): 51–71. http://projecteuclid.org/euclid.ss/1280841733
Imai, Kosuke, Tingley, Dustin, and Yamamoto, Teppei. 2013. “Experimental Designs for Identifying Causal Mechanisms.” Journal of the Royal Statistical Society, Series A (Statistics in Society) 176 (1): 5–51.
Imai, Kosuke, and Yamamoto, Teppei. 2013. “Identification and Sensitivity Analysis for Multiple Causal Mechanisms: Revisiting Evidence from Framing Experiments.” Political Analysis 21 (2): 141–71. http://pan.oxfordjournals.org/content/21/2/141.abstract
Imbens, Guido W. 2004. “Nonparametric Estimation of Average Treatment Effects Under Exogeneity: A Review.” Review of Economics and Statistics 86 (1): 4–29.
Joffe, Marshall M., and Greene, Tom. 2009. “Related Causal Frameworks for Surrogate Outcomes.” Biometrics 65 (2): 530–8.
Joffe, Marshall M., Small, Dylan, and Hsu, Chi-Yuan. 2007. “Defining and Estimating Intervention Effects for Groups that will Develop an Auxiliary Outcome.” Statistical Science 22 (1): 74–97.
Neyman, Jerzy. 1923. “On the Application of Probability Theory to Agricultural Experiments. Essay on Principles. Section 9.” Statistical Science 5: 465–80. Translated in 1990, with discussion.
Nunn, Nathan, and Wantchekon, Leonard. 2011. “The Slave Trade and the Origins of Mistrust in Africa.” American Economic Review 101 (7): 3221–52.
Pearl, Judea. 2001. “Direct and Indirect Effects.” In Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (UAI ’01). San Francisco: Morgan Kaufmann, 411–20. http://dl.acm.org/citation.cfm?id=2074022.2074073
Robins, James M. 1986. “A New Approach to Causal Inference in Mortality Studies with Sustained Exposure Periods: Application to Control of the Healthy Worker Survivor Effect.” Mathematical Modelling 7 (9-12): 1393–512. http://biosun1.harvard.edu/~robins/new-approach.pdf
Robins, James M. 1994. “Correcting for Non-Compliance in Randomized Trials Using Structural Nested Mean Models.” Communications in Statistics 23 (8): 2379–412. http://www.hsph.harvard.edu/james-robins/files/2013/03/correcting-1994.pdf
Robins, James M. 1997. “Causal Inference from Complex Longitudinal Data.” In Latent Variable Modeling and Applications to Causality, ed. Berkane, M. Vol. 120 of Lecture Notes in Statistics. New York: Springer-Verlag, 69–117. http://biosun1.harvard.edu/~robins/cicld-ucla.pdf
Robins, James M. 2003. “Semantics of Causal DAG Models and the Identification of Direct and Indirect Effects.” In Highly Structured Stochastic Systems, eds. Green, P. J., Hjort, N. L., and Richardson, S. Oxford: Oxford University Press, 70–81.
Robins, James M., and Greenland, Sander. 1992. “Identifiability and Exchangeability for Direct and Indirect Effects.” Epidemiology 3 (2): 143–55.
Robins, James M., Hernán, Miguel A., and Brumback, Babette A. 2000. “Marginal Structural Models and Causal Inference in Epidemiology.” Epidemiology 11 (5): 550–60. http://www.jstor.org/stable/3703997
Rosenbaum, Paul R. 1984. “The Consequences of Adjustment for a Concomitant Variable That Has Been Affected by the Treatment.” Journal of the Royal Statistical Society, Series A (General) 147 (5): 656–66.
Rubin, Donald B. 1974. “Estimating Causal Effects of Treatments in Randomized and Nonrandomized Studies.” Journal of Educational Psychology 66 (5): 688–701.
Rubin, Donald B. 2004. “Direct and Indirect Causal Effects via Potential Outcomes.” Scandinavian Journal of Statistics 31 (2): 161–70.
VanderWeele, Tyler J. 2009. “Mediation and Mechanism.” European Journal of Epidemiology 24 (5): 217–24.
VanderWeele, Tyler J. 2011. “Controlled Direct and Mediated Effects: Definition, Identification and Bounds.” Scandinavian Journal of Statistics 38 (3): 551–63.
VanderWeele, Tyler J. 2014. “A Unification of Mediation and Interaction: A 4-Way Decomposition.” Epidemiology 25 (5): 749–61.
VanderWeele, Tyler J., and Tchetgen Tchetgen, Eric J. 2014. “Attributing Effects to Interactions.” Epidemiology 25 (5): 711–22.
Vansteelandt, Stijn. 2009. “Estimating Direct Effects in Cohort and Case–Control Studies.” Epidemiology 20 (6): 851–60.
Vansteelandt, Stijn. 2010. “Estimation of Controlled Direct Effects on a Dichotomous Outcome Using Logistic Structural Direct Effect Models.” Biometrika 97 (4): 921–34.
Vansteelandt, Stijn, and Joffe, Marshall. 2014. “Structural Nested Models and G-estimation: The Partially Realized Promise.” Statistical Science 29 (4): 707–31.
Vansteelandt, Stijn, and VanderWeele, Tyler J. 2009. “Conceptual Issues Concerning Mediation, Interventions and Composition.” Statistics and Its Interface 2: 457–68.
Supplementary material: Acharya et al. supplementary material (PDF, 95 KB)