
Assessing what students know: Effects of assessment type on spelling performance and relation to working memory

Journal Article



Abstract


  • A central objective of educational assessment is to maximise the accuracy (validity) and consistency (reliability) of the methods used to assess students’ competencies. Different tests, however, often employ different methods of assessing the same domain-specific skills (e.g., spelling). As a result, questions have arisen concerning the legitimacy of using these various modes interchangeably as proxies for students’ abilities. To investigate the merit of these contentions, this study examined university students’ spelling performance across three commonly employed test modalities (i.e., dictation, error correction, proofreading). To further examine whether these test types vary in the cognitive load they place on test takers, correlations between working memory and spelling scores were also examined. Results indicated that the modes of assessment were not equivalent indices of individuals’ orthographic knowledge. Specifically, performance in the dictation and error correction conditions was superior to that in the proofreading condition. Moreover, correlational analyses revealed that working memory accounted for significant variance in performance in the dictation and error correction conditions (but not in the proofreading condition). These findings suggest that not all standardised assessment methods accurately capture student competencies and that these domain-specific assessments should seek to minimise the domain-general cognitive demands placed on test takers.

Publication Date


  • 2014

Citation


  • Calleia, A. M. & Howard, S. J. (2014). Assessing what students know: Effects of assessment type on spelling performance and relation to working memory. The Journal of Student Engagement: Education Matters, 4 (1), 14-24.

Full-Text URL (Research Online)


  • http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1030&context=jseem

Metadata URL (Research Online)


  • http://ro.uow.edu.au/jseem/vol4/iss1/3/

Number Of Pages


  • 10

Start Page


  • 14

End Page


  • 24

Volume


  • 4

Issue


  • 1

Place Of Publication


  • Australia
