
Analytical Options for Single-Case Experimental Designs: Review and Application to Brain Impairment

Published online by Cambridge University Press: 02 October 2017

Rumen Manolov* and Antonio Solanas
Department of Social Psychology and Quantitative Psychology, Faculty of Psychology, University of Barcelona, Barcelona, Spain

*Address for correspondence: Dr. Rumen Manolov, Departament de Psicologia Social i Psicologia Quantitativa (Secció de Psicologia Quantitativa), Facultat de Psicologia, Universitat de Barcelona, Passeig de la Vall d'Hebron, 171, 08035 Barcelona, Spain. E-mail: [email protected]

Abstract

Single-case experimental designs meeting evidence standards are useful for identifying empirically supported practices. Part of the research process entails data analysis, which can be performed both visually and numerically. In the current text, we discuss several statistical techniques, focusing on the descriptive quantifications they provide of aspects such as overlap and differences in level and in slope. Two previously published data sets from patients with traumatic brain injury are re-analysed; in both cases, the numerical results are interpreted in light of the data characteristics identified via visual inspection. These re-analyses illustrate several analytical options and the data patterns for which each technique is especially useful, given its assumptions and limitations. To make the review maximally informative for applied researchers, we point to free, user-friendly web applications of the analytical techniques, and we offer up-to-date references to potentially useful techniques not illustrated in the article. Finally, we note some analytical challenges and offer tentative recommendations for dealing with them.
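As a concrete illustration of the kind of descriptive quantifications the abstract refers to, the following Python sketch computes an overlap index (Nonoverlap of All Pairs, NAP; Parker & Vannest, 2009), a difference in level, and a difference in slope for a two-phase (A-B) data set. This is a minimal sketch, not the authors' own implementation or the web applications they recommend; the data values are hypothetical and chosen only to make the arithmetic visible.

```python
# Minimal sketch of three descriptive quantifications for A-B single-case data:
# overlap (NAP), mean difference in level, and difference in OLS slope.
# The measurements below are hypothetical; higher scores are assumed to
# indicate improvement.

from itertools import product

def nap(baseline, intervention):
    """NAP: proportion of all baseline-intervention pairs in which the
    intervention value exceeds the baseline value; ties count as 0.5."""
    pairs = list(product(baseline, intervention))
    score = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return score / len(pairs)

def ols_slope(y):
    """Ordinary least-squares slope of y on measurement occasion.
    Occasions are taken as 0..n-1; the slope is invariant to the origin."""
    n = len(y)
    t_mean = (n - 1) / 2
    y_mean = sum(y) / n
    num = sum((t - t_mean) * (yi - y_mean) for t, yi in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

baseline = [2, 3, 2, 4, 3]            # hypothetical phase A measurements
intervention = [5, 4, 6, 5, 7, 6, 6]  # hypothetical phase B measurements

print(f"NAP:          {nap(baseline, intervention):.2f}")  # 1.00 = no overlap
print(f"Level change: "
      f"{sum(intervention)/len(intervention) - sum(baseline)/len(baseline):.2f}")
print(f"Slope change: {ols_slope(intervention) - ols_slope(baseline):.2f}")
```

A NAP close to 1 indicates near-complete nonoverlap (about 0.99 for these hypothetical data), while the level and slope changes are expressed in the units of the outcome measure. As the abstract stresses, such numerical summaries are meant to be read alongside visual inspection of the data, not as a substitute for it.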

Type: Articles
Copyright © Australasian Society for the Study of Brain Impairment 2017

