
Diagnostic Classification Analysis of Problem-Solving Competence using Process Data: An Item Expansion Method

Published online by Cambridge University Press:  01 January 2025

Peida Zhan
Affiliation:
Zhejiang Normal University
Xin Qiao*
Affiliation:
University of Maryland
*Correspondence should be made to Xin Qiao, Measurement, Statistics, and Evaluation, Department of Human Development and Quantitative Methodology, University of Maryland, 1230 Benjamin Building, College Park, MD 20742, USA. Email: [email protected]

Abstract

Process data refer to data recorded in computer-based assessments (CBAs) that reflect respondents’ problem-solving processes and provide greater insight into how respondents solve problems, in addition to how well they solve them. Using the rich information contained in process data, this study proposed an item expansion method to analyze action-level process data from the perspective of diagnostic classification in order to comprehensively understand respondents’ problem-solving competence. The proposed method can not only estimate respondents’ problem-solving ability along a continuum, but also classify respondents according to their problem-solving skills. To illustrate the application and advantages of the proposed method, a Programme for International Student Assessment (PISA) problem-solving item was used. The results indicated that (a) the estimated latent classes provided more detailed diagnoses of respondents’ problem-solving skills than the observed score categories; (b) although only one item was used, the estimated higher-order latent ability reflected the respondents’ problem-solving ability more accurately than the unidimensional latent ability estimated from the outcome data; and (c) interactions among problem-solving skills followed the conjunctive condensation rule, which indicated that a specific action sequence appeared only when a respondent mastered all required problem-solving skills. In conclusion, the proposed diagnostic classification approach is feasible and promising for analyzing process data.
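The conjunctive condensation rule mentioned in finding (c) can be illustrated with a minimal sketch. This is not the authors’ actual model code; the Q-matrix, slip, and guess values below are hypothetical, and the sketch simply shows the DINA-style conjunctive logic in which an expanded item (an action sequence) is expected to occur only when all skills it requires are mastered.

```python
import numpy as np

# Hypothetical Q-matrix for two expanded items (action sequences):
# rows = action sequences, columns = problem-solving skills.
Q = np.array([
    [1, 1, 0],  # sequence 1 requires skills 1 and 2
    [0, 1, 1],  # sequence 2 requires skills 2 and 3
])

def sequence_probability(alpha, q, slip=0.1, guess=0.2):
    """Probability of observing an action sequence under a
    conjunctive (DINA-style) condensation rule: the ideal
    response is 1 only if ALL required skills are mastered."""
    eta = int(all(a >= r for a, r in zip(alpha, q)))
    return (1 - slip) if eta == 1 else guess

# A respondent who has mastered skills 1 and 2 but not skill 3.
alpha = [1, 1, 0]
probs = [sequence_probability(alpha, q) for q in Q]
# Sequence 1 is likely (all required skills mastered);
# sequence 2 is unlikely (skill 3 is missing).
```

Under this rule, missing even one required skill drops the sequence probability to the guessing level, which is what distinguishes the conjunctive condensation rule from compensatory alternatives.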

Type
Application Reviews and Case Studies
Copyright
Copyright © 2022 The Author(s) under exclusive licence to The Psychometric Society


Footnotes

Supplementary Information The online version contains supplementary material available at https://doi.org/S0033312300005561a.

Supplementary material

Zhan and Qiao supplementary material: Online Appendix (File, 995.2 KB)
Zhan and Qiao supplementary material 1 (File, 21.6 MB)