Selecting a quantitative measure of metacognition  

by Dr. Jessica Santangelo, Hofstra University

If you are interested in metacognition and promoting the development of metacognitive skills, you may also be interested in measuring metacognition. But how does one assess a person’s metacognitive development?

Metacognitive development can be assessed via quantitative or qualitative measures. Quantitative measures include self-report measures, often using Likert-style survey instruments, while qualitative measures involve coding responses to open-ended prompts (e.g., Stanton et al. 2015). While quantitative measures are generally easier and faster to score, a drawback is that self-report measures are not always accurate (Schunk 2008). Qualitative data can be richer, providing deeper and more nuanced information, but are much more labor-intensive and time-consuming to analyze. Ideally, one uses a combination of quantitative and qualitative data to develop as complete a picture of metacognitive development as possible.

When I set out to assess the metacognitive development of 484 (!) students, I was overwhelmed by the number of quantitative tools available. The focus of the tools varies. Some attempt to assess metacognition directly while others assess factors or attributes associated with metacognition (e.g., study skills, self-regulated learning). Some are not explicitly intended to assess metacognition (e.g., the LASSI) but are used by some authors as an indicator of metacognitive development (e.g., Downing et al. 2007, 2011). Some have been through many iterations over the years (e.g., the ASI, RASI, and ASSIST) while others remain relatively unchanged (e.g., the MAI, MSLQ). Some are free while others charge a per-student fee. Some are longer (the 120-item ILS) and others shorter (the 18-item RASI).

How does one choose the “best” quantitative tool? Unfortunately, there is no easy answer. It depends on the specific question being addressed and the amount of time and money available to administer the tool. I compiled a (non-comprehensive) list of tools I encountered in my search along with some information about each one to assist anyone looking for a quantitative measure of metacognitive development.

For my 484-student project, I chose the Metacognitive Awareness Inventory (MAI; Schraw and Dennison 1994) in combination with coding responses to open-ended prompts I created. I chose the MAI because it purports to measure metacognition directly (rather than being a study- or learning-skills inventory), is free, and is of moderate length (52 items). Others have found correlations between MAI results and other quantitative measures of student success (e.g., GPA and end-of-course grades), even suggesting the MAI could serve as a screening tool to identify students who would benefit from metacognition training (Young and Fry 2008). These characteristics fit the questions I was asking: Can we rapidly (and accurately) assess metacognitive development at the beginning of an introductory course? Does including explicit instruction and implicit practice with metacognitive skills in a course increase students’ metacognitive development?

While coding the open-ended responses is taking months to complete, it has revealed some clear and interesting patterns. In contrast, the quantitative data from the MAI, though gathered in about five minutes by running Scantron sheets through a machine, show no patterns at all. There does not appear to be any relationship between the quantitative MAI data and the qualitative data, or between the MAI data and any other measure of student success (GPA, exam and course grades, etc.). I’m not entirely surprised: metacognitive skills are unlikely to be wholly captured by a number generated by a 52-item self-report questionnaire. However, given the results of others (e.g., Sperling et al. 2004, Young and Fry 2008), I was hopeful there would be at least some relationship between the quantitative and qualitative results.
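For readers who want to run the same kind of check on their own data, the relationship between MAI totals and a success measure such as course grade reduces to a simple correlation test. The sketch below is purely illustrative, not my analysis pipeline; the file name and the column names (mai_total, course_grade) are hypothetical placeholders you would swap for your own.

```python
# Illustrative sketch: does the MAI total score track course grade?
# File and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import pearsonr, spearmanr

df = pd.read_csv("student_data.csv")  # one row per student
df = df.dropna(subset=["mai_total", "course_grade"])

# Pearson assumes a linear relationship; Spearman assumes only
# monotonicity, which is safer for Likert-derived scores like the MAI.
r, p_r = pearsonr(df["mai_total"], df["course_grade"])
rho, p_rho = spearmanr(df["mai_total"], df["course_grade"])

print(f"Pearson r = {r:.2f} (p = {p_r:.3f})")
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
```

In my data, neither coefficient differed meaningfully from zero, which is what I mean above by "no patterns at all."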

This is not to say that rapid assessments via self-report questionnaires are worthless. It is simply a caution not to rely on these quantitative tools as one’s sole measure of metacognitive development. Indeed, I have colleagues who have had more “success” with tools other than the MAI (e.g., the MSLQ), where success is defined as the quantitative tool reflecting the same patterns or trends as other, more time-consuming qualitative measures.

As with many things in science, there is no easy answer. My hope is that this compilation of available tools makes the choice of which one to use a little easier.

For more in-depth reading on measuring metacognition, I recommend:

Mogashana, D., J. M. Case, and D. Marshall. 2012. What do student learning inventories really measure? A critical analysis of students’ responses to the approaches to learning and studying inventory. Studies in Higher Education 37:783–792.

Schraw, G., and J. Impara, eds. 2000. Issues in the Measurement of Metacognition. Buros Institute of Mental Measurements, Lincoln, NE.

References

Downing, K., F. Ning, and K. Shin. 2011. Impact of problem-based learning on student experience and metacognitive development. Multicultural Education & Technology Journal 5:55–69.

Downing, K., R. Ho, K. Shin, L. Vrijmoed, and E. Wong. 2007. Metacognitive development and moving away. Educational Studies 33:1–13.

Schraw, G., and R. S. Dennison. 1994. Assessing metacognitive awareness. Contemporary Educational Psychology 19:460–475.

Schunk, D. H. 2008. Metacognition, self-regulation, and self-regulated learning: research recommendations. Educational Psychology Review 20:463–467.

Sperling, R. A., B. C. Howard, R. Staley, and N. DuBois. 2004. Metacognition and self-regulated learning constructs. Educational Research and Evaluation 10:117–139.

Stanton, J. D., X. N. Neider, I. J. Gallegos, and N. C. Clark. 2015. Differences in metacognitive regulation in introductory biology students: when prompts are not enough. CBE-Life Sciences Education 14:ar15.

Young, A., and J. Fry. 2008. Metacognitive awareness and academic achievement in college students. Journal of the Scholarship of Teaching and Learning 8:1–10.