Boosting Metacognition through In-Class Assessments

By Jennifer A. McCabe, Goucher College

Five years ago, I radically changed the assessment format of my undergraduate Human Learning and Memory course, moving from a traditional model with three big exams to a frequent low-stakes testing approach: a short quiz at the start of most class periods, with no big exams. This choice came on the heels of two other shifts in the course, from a content-driven focus to an emphasis on elaboration and integration, and from a textbook to five popular-press books about learning and memory (syllabus available by request). I decided that the assessments should more intentionally reflect a central goal for the course: that students come to class each day prepared to actively engage with and discuss the assigned material. When I first made this change, I gave little explicit thought to metacognitive development in my students; as I describe in this blog post, along the way I tweaked how the quizzes were framed and administered to more transparently support metacognition.

The first iteration of the daily low-stakes assessment was described as a “KCA Quiz,” designed to assess (and improve) students’ Knowledge, Connection, and Application (“KCA”) of course topics and readings. At the start of most class periods, students had about 10 minutes to complete a 5-item open-book, open-notes quiz, which could include factual, connection, application, and thought/opinion questions. These questions mainly focused on the reading assignment for the current day, but could also draw from prior assigned readings and topics. Students were expected to bring all of their notes and books to class each day in hard copy (no electronics allowed) for reference. After I collected the quizzes, we discussed the answers as a large group, and each quiz was graded on a scale from 0 to 3: 0 (absent), 1 (0 or 1 correct), 2 (2 or 3 correct), or 3 (4 or 5 correct).

Certainly some elements of this assessment strategy had the potential to impact metacognition. For one, there was a consequence for lack of preparation if students could not find the answers in time. Also, receiving immediate feedback in class about whether their answers were correct should have given them insight into their learning. Yet, given what we now know about the power of retrieval practice (i.e., the testing effect; see the recommended readings below) and its inherent memory and metacognitive benefits, I was dissatisfied with the low (or nonexistent) expectation that students retrieve the information from memory without using external sources. I therefore worked toward a modified version of the “KCA Quizzes” that would better support retrieval practice and metacognition while preserving the low-stakes, frequent-testing components.

Starting in Fall 2016, I made several changes to this component of the course. First, I started calling them “KCA Assessments” instead of “KCA Quizzes.” Previously, I would get some complaints – in person and on course evaluations – about the pressure and stress of having a quiz every day. Simply shifting the language from “quiz” to “assessment” completely eliminated those complaints. I think that being “assessed” rather than “quizzed” activates a different schema for the students – perhaps representing a metacognitive shift from a performance focus to more of a learning focus.

This is particularly striking given that, along with the name change, I made the assessments more challenging. They are now a hybrid of closed- and open-book/notes formats, with a unique metacognitive twist. Students start by dividing their paper into left and right columns. For the first five minutes, they answer the five questions from memory (closed-book) in the left column. Then I announce that they can open their books and notes, and anything they want to add or modify about their answers goes in the right column. They know that I grade their answers (using the same generous scale described above) based only on whether they arrived at the correct answers through the combination of closed- and open-book work. Yet the left-right column method forces them not only to spend time effortfully retrieving the information (or at least trying to, which, as we discuss in class, still benefits memory), but also to create a clear record of how easily and accurately they could arrive at correct answers from long-term memory, without consulting external sources. This supports metacognition by building students’ explicit awareness of their level of learning, which they can then use to guide their further learning behaviors. I encourage them to strive to answer all the questions in the left column, but the pressure of testing is relieved by the back-up plan of consulting course materials.

Recently, I administered a brief anonymous feedback survey about the KCA Assessments. Of the 21 students who responded in a class of 25, 86% agreed or strongly agreed that the assessments “improved my metacognition – that is, they helped me think and know more about my own learning and memory.” When asked an open-ended question about which aspect(s) of the assessments supported metacognition, 65% identified the closed/open-book hybrid approach, with comments such as:

“You couldn’t be convinced you know something if you couldn’t get it during the closed-book portion.”

“Helped me see what I actually remembered and what I needed help with or didn’t encode or couldn’t retrieve.”

“This allowed me to use my own memory to remember answers and gave me an idea of what I need to focus on more for when I read for the next class.”

“Having the closed then open note format really obviously shows what you processed more deeply than others.”

Other answers pointed to the focus on deeper processing and application to real-life issues, becoming more interactive and engaged with course reading assignments, the immediate feedback after the assessments, and the reduced-pressure grading scale. The majority (90%) said they probably or definitely “would recommend keeping the KCA Assessments for future classes.”

The current iteration of this in-class assessment strategy grew from my own metacognitive insight as an instructor about balancing student learning, engagement, and incentives for examining, and potentially changing, learning strategies. Judging from student performance, feedback on course evaluations, and this survey, the approach is palatable (even enjoyable) for students, encourages deep and elaborative reading, supports durable memory for course material, and, at least by way of self-report, boosts metacognition in undergraduates.

Recommended Reading

Putnam, A. L., Sungkhasettee, V. W., & Roediger, H. L. (2016). Optimizing learning in college: Tips from cognitive psychology. Perspectives on Psychological Science, 11, 652-660. https://doi.org/10.1177/1745691616645770

Roediger, H. L., & Pyc, M. A. (2012). Inexpensive techniques to improve education: Applying cognitive psychology to enhance educational practice. Journal of Applied Research in Memory and Cognition, 1, 242-248. https://doi.org/10.1016/j.jarmac.2012.09.002