
Experimental evidence, scaling and public policy: a perspective from developing countries

Published online by Cambridge University Press: 28 July 2020

ANANDI MANI*
Affiliation: Blavatnik School of Government, University of Oxford, Oxford, UK
*Correspondence to: Blavatnik School of Government, University of Oxford, Oxford, UK. E-mail: [email protected]

Abstract

I highlight two important factors particular to less-developed countries that can bias evidence generation and contribute to the ‘voltage drop’ in programme benefits, moving from field research experiments to policy implementation at scale. The first is the non-linear increase in information processing and coordination costs associated with upscaling in less-developed countries, given limited state capacity and rigid organizational hierarchies. The second is political bias in the choice of programmes considered for rigorous evaluation itself, resulting in distorted evidence and policy choice. These two factors raise considerations that complement the economics-based approach outlined by Al-Ubaydli et al. in the quest for more rigorous, evidence-based policy.

Type
Articles
Copyright
Copyright © The Author(s) 2020. Published by Cambridge University Press

