Traditional analyses of test results based on Item Response Theory (IRT) and classical test theory (CTT) provide unidimensional ability estimates, which allow students to be ranked according to their abilities. Cognitive diagnostic assessment (CDA), by contrast, aims to provide formative diagnostic feedback through fine-grained reporting of learners' skill mastery. The present study applied the deterministic input, noisy "and" gate (DINA) model (de la Torre & Douglas, 2004) to investigate the degree to which the items of a high-stakes reading comprehension test can provide diagnostically useful information. Through expert rating, a Q-matrix comprising five subskills was developed. Using the DINA model, students' skill mastery profiles were produced, and issues related to the diagnostic capacity of the reading test were discussed. In addition, the study pedagogically demonstrates how the DINA model can be applied using R software.
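As a minimal sketch of this kind of workflow (not the study's actual data, Q-matrix, or code), the example below fits a DINA model in R with the CDM package, using the package's built-in simulated responses and Q-matrix as stand-ins for the reading-test data and the five-subskill Q-matrix developed through expert rating, and then inspects per-student skill mastery estimates.

```r
# Hedged illustration only: CDM's simulated data replace the study's
# reading-comprehension responses and expert-rated five-subskill Q-matrix.
library(CDM)

data(sim.dina)      # simulated dichotomous (0/1) item responses
data(sim.qmatrix)   # Q-matrix linking each item to the attributes it requires

# Estimate the DINA model: item guessing/slipping parameters and the
# distribution of latent skill classes
fit <- din(data = sim.dina, q.matrix = sim.qmatrix, rule = "DINA")

summary(fit)        # item parameters, model fit, skill prevalences

# Per-student skill mastery profiles (classification-based estimates)
head(fit$pattern)
```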