Part One: Does Processing Fluency Really Matter for Metacognition in Actual Learning Situations?

By Michael J. Serra, Texas Tech University

Part I: Fluency in the Laboratory

Much recent research demonstrates that learners judge their knowledge (e.g., memory or comprehension) to be better when information seems easy to process and worse when information seems difficult to process, even when eventual test performance is not predicted by such experiences. Laboratory-based researchers often argue that the misuse of such experiences as the basis for learners’ self-evaluations can produce metacognitive illusions and lead to inefficient study. In the present post, I review these effects obtained in the laboratory. In the second part of this post, I question whether these outcomes are worth worrying about in everyday, real-life learning situations.

What is Processing Fluency?

Have you ever struggled to hear a low-volume or garbled voicemail message, or struggled to read small or blurry printed text? Did you experience some relief after raising the volume on your phone or putting on your reading glasses and trying again? What if you didn’t have your reading glasses with you at the time? You might still be able to read the small printed text, but it would take more effort and might literally feel more effortful than if you had your glasses on. Would the feeling of effort you experienced while reading without your glasses affect your appraisal of how much you liked or how well you understood what you read?

When we process information, we often have a co-occurring experience of processing fluency: the ease or difficulty we experience while physically processing that information. Note that this experience is technically independent of the innate complexity of the information itself. For example, an intricate and conceptually confusing physics textbook might be printed in a large and easy-to-read font (high difficulty, perceptually fluent), while a child might express a simple message to you in a voice that is too low to be easily understood over the noise of a birthday party (low difficulty, perceptually disfluent).

Fluency and Metacognition

Certainly, we know that the innate complexity of learning materials is going to relate to students’ acquisition of new information and eventual performance on tests. Put differently, easy materials will be easy for students to learn and difficult materials will be difficult for students to learn. And it turns out that perceptual disfluency – difficulty processing information – can actually improve memory under some limited conditions (for a detailed examination, see Yue et al., 2013). But how does processing fluency affect students’ metacognitive self-evaluations of their learning?

In the modal laboratory-based examination of metacognition (for a review, see Dunlosky & Metcalfe, 2009), participants study learning materials (these might be simple memory materials or complex reading materials), make explicit metacognitive judgments in which they rate their learning or comprehension of those materials, and then complete a test over what they’ve studied. Researchers can then compare learners’ judgments to their test performance in a variety of ways to determine the accuracy of their self-evaluations. As you might know from reading other posts on this website, we usually want learners to judge their learning accurately so they can make efficient decisions about how to allocate their study time and what information to focus on when studying. Any factor that reduces that accuracy is likely to be problematic for ultimate test performance.
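One common way researchers quantify this judgment-to-performance correspondence is a relative-accuracy statistic such as the Goodman-Kruskal gamma correlation, computed for each learner across studied items. Below is a minimal Python sketch of that idea; the data are hypothetical and not taken from any study cited here:

```python
# Relative metacognitive accuracy as Goodman-Kruskal gamma:
# do higher judgments of learning (JOLs) go with items that are
# actually recalled later (1 = recalled, 0 = not recalled)?

def gamma(jols, recalled):
    """Return gamma in [-1, 1] from item-wise JOLs and test outcomes."""
    concordant = discordant = 0
    for i in range(len(jols)):
        for j in range(i + 1, len(jols)):
            product = (jols[i] - jols[j]) * (recalled[i] - recalled[j])
            if product > 0:
                concordant += 1
            elif product < 0:
                discordant += 1
    if concordant + discordant == 0:
        return 0.0  # all pairs tied; accuracy is undefined, report 0 here
    return (concordant - discordant) / (concordant + discordant)

# Accurate learner: higher JOLs line up with the items actually recalled.
print(gamma([80, 60, 40, 20], [1, 1, 0, 0]))  # 1.0
# Fluency-misled learner: confident about fluent items that are not recalled.
print(gamma([90, 85, 30, 25], [0, 1, 1, 0]))  # 0.0
```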

Metacognition researchers have examined how fluency affects participants’ judgments of their learning in the laboratory. The figure in this post includes several examples of ways in which researchers have manipulated the visual perceptual fluency of learning materials (i.e., memory materials or reading materials) to make them perceptually disfluent compared to a fluent condition. These manipulations include presenting learning materials in an easy-to-read versus difficult-to-read typeface, either by literally blurring the font (Yue et al., 2013) or by adjusting the colors of the words and background to make them easy versus difficult to read (Werth & Strack, 2003); presenting text upside-down versus right-side up (Sungkhasettee et al., 2011); and using normal capitalization versus capitalizing every other letter (Mueller et al., 2013). (A conceptually similar manipulation of auditory perceptual fluency might make the volume high versus low, or the audio quality clear versus garbled.)

A wealth of empirical (mostly laboratory-based) research demonstrates that learners typically judge perceptually-fluent learning materials to be better-learned than perceptually-disfluent learning materials, even when learning (i.e., later test performance) is the same for the two sets of materials (e.g., Magreehan et al., 2015; Mueller et al., 2013; Rhodes & Castel, 2008; Susser et al., 2013; Yue et al., 2013). Although there is a current theoretical debate as to why processing fluency affects learners’ metacognitive judgments of their learning (do the effects stem from the experience of fluency or from explicit beliefs about fluency? see Magreehan et al., 2015; Mueller et al., 2013), it is nevertheless clear that manipulations such as those in the figure can affect how much students think they know. In terms of metacognitive accuracy, learners are often misled by feelings of fluency or disfluency that are neither related to their level of learning nor predictive of their future test performance.

As I previously noted, laboratory-based researchers argue that the misuse of such experiences as the basis for learners’ self-evaluations can produce metacognitive illusions and lead to inefficient study. But this question has yet to receive much empirical scrutiny in more realistic learning situations. I explore the possibility that such effects will also obtain in realistic learning situations in the second part of this post.

References

Dunlosky, J., & Metcalfe, J. (2009). Metacognition. Thousand Oaks, CA: Sage.

Magreehan, D. A., Serra, M. J., Schwartz, N. H., & Narciss, S. (2015). Further boundary conditions for the effects of perceptual disfluency on judgments of learning. Metacognition and Learning. Advance online publication.

Mueller, M. L., Tauber, S. K., & Dunlosky, J. (2013). Contributions of beliefs and processing fluency to the effect of relatedness on judgments of learning. Psychonomic Bulletin & Review, 20, 378-384.

Rhodes, M. G., & Castel, A. D. (2008). Memory predictions are influenced by perceptual information: evidence for metacognitive illusions. Journal of Experimental Psychology: General, 137, 615-625.

Sungkhasettee, V. W., Friedman, M. C., & Castel, A. D. (2011). Memory and metamemory for inverted words: Illusions of competency and desirable difficulties. Psychonomic Bulletin & Review, 18, 973-978.

Susser, J. A., Mulligan, N. W., & Besken, M. (2013). The effects of list composition and perceptual fluency on judgments of learning (JOLs). Memory & Cognition, 41, 1000-1011.

Werth, L., & Strack, F. (2003). An inferential approach to the knew-it-all-along phenomenon. Memory, 11, 411-419.

Yue, C. L., Castel, A. D., & Bjork, R. A. (2013). When disfluency is—and is not—a desirable difficulty: The influence of typeface clarity on metacognitive judgments and memory. Memory & Cognition, 41, 229-241.


Unskilled and Unaware: A Metacognitive Bias

by John R. Schumacher, Eevin Akers, & Roman Taraban (all from Texas Tech University).

In 1995, McArthur Wheeler robbed two Pittsburgh banks in broad daylight, with no attempt to disguise himself. When he was arrested that night, he objected, “But I wore the juice.” Because lemon juice can be used as an invisible ink, Wheeler thought that rubbing his face with lemon juice would make it invisible to the surveillance cameras in the banks. Kruger and Dunning (1999) used Wheeler’s story to exemplify a metacognitive bias through which relatively unskilled individuals overestimate their skill, being both unaware of their ineptitude and holding an inflated sense of their knowledge or ability. This is called the Dunning-Kruger effect, and it also seems to apply to some academic settings. For example, Kruger and Dunning found that some students accurately predict their performance prior to taking a test: they predict that they will do well on the test and actually perform well. Other students predict that they will do well but perform poorly; these students have an inflated sense of how well they will do, and thus fit the Dunning-Kruger effect. Because their predictions do not match their performance, we describe them as poorly calibrated. Good calibration involves metacognitive awareness. This post explores how note taking relates to calibration and metacognitive awareness.

Some of the experiments in our lab concern the benefits of note taking. In these experiments, students were presented with a college lecture. Note takers recalled more than non-notetakers, who simply watched the video (Jennings & Taraban, 2014). The question we explored was whether good note taking skills improved students’ calibration of how much they know and thereby reduced the unskilled and unaware effect reported by Kruger and Dunning (1999).

In one experiment, participants watched a 30-minute video lecture while either taking notes (notetakers) or simply viewing the video (non-notetakers). They returned 24 hours later. They predicted the percentage of information they believed they would recall, using a scale of 0 to 100, and then took a free-recall test, without being given an opportunity to study their notes or mentally review the prior day’s video lecture. They then studied their notes (notetakers) or mentally reviewed the lecture (non-notetakers) for 12 minutes, and took a second free-recall test. To assess the Dunning-Kruger effect, we subtracted the actual percentage of lecture material recalled on each test (0 to 100) from participants’ predictions of how much they would recall on that test (0 to 100). For example, if a participant predicted he or she would correctly recall 75% of the material on a test and actually recalled 50%, the calibration score would be +25 (75 – 50 = 25). Values close to +100 indicated extreme overconfidence, values close to -100 indicated extreme underconfidence, and values close to 0 indicated good calibration. To answer our question about how note taking relates to calibration, we compared the calibration scores for the two groups (notetakers and non-notetakers) for the two situations (before and after reviewing notes or mentally reviewing). These analyses indicated that the two groups did not differ in calibration on the first free-recall test. However, to our surprise, notetakers became significantly more overconfident, and thus less calibrated in their predictions, than non-notetakers on the second test: after studying, notetakers’ calibration became worse.
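For readers who want the arithmetic made concrete, here is a minimal Python sketch of the calibration score just described; the participants are hypothetical, and the +25 case mirrors the example in the text:

```python
# Calibration score = predicted recall minus actual recall (both 0-100).
# Positive = overconfident, negative = underconfident, near 0 = well calibrated.

def calibration_score(predicted_pct, recalled_pct):
    return predicted_pct - recalled_pct

participants = [
    ("P1", 75, 50),  # predicts 75%, recalls 50% -> +25, overconfident
    ("P2", 40, 55),  # predicts 40%, recalls 55% -> -15, underconfident
    ("P3", 62, 60),  # predicts 62%, recalls 60% -> +2, well calibrated
]
for pid, predicted, recalled in participants:
    print(pid, calibration_score(predicted, recalled))
```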

Note taking increases test performance. So why doesn’t note taking improve calibration? Since note takers are more “skilled”, that is, have encoded and stored more information from the lecture, shouldn’t they be more “aware”, that is, better calibrated, as the Dunning-Kruger effect would imply? One possible explanation is that studying notes immediately increases the amount of information processed in working memory. The information that participants will be asked to recall shortly is highly active and available. This sense of availability produces the inflated (and false) prediction that much information will be remembered on the test. Is this overconfidence harmful to the learner? It could be harmful to the extent that individuals often self-generate predictions of how well they will do on a test in order to self-regulate their study behaviors. Poor calibration of these predictions could lead to the individual failing to recognize that he or she requires additional study time before all material is properly stored and able to be recalled.

If note taking itself is not the problem, then is there some way students can improve their calibration after studying in order to better regulate subsequent study efforts? The answer is “yes.” Research has shown that predictions of future performance improve if there is a short delay between studying information and predicting subsequent test performance (Thiede, Dunlosky, Griffin, & Wiley, 2005). To improve calibration, students should be encouraged to wait after studying their notes before judging whether they need additional study time. To improve metacognitive awareness with respect to calibration, students need to understand that immediate judgments of how much they know may be inflated, and that waiting a short time before judging whether they need more study will result in more effective self-regulation of study time.

References
Jennings, E., & Taraban, R. (2014, May). Note-taking in the modern college classroom: Computer, paper and pencil, or listening? Paper presented at the meeting of the Midwestern Psychological Association (MPA), Chicago, IL.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.

Thiede, K. W., Dunlosky, J., Griffin, T. D., & Wiley, J. (2005). Understanding the delayed-keyword effect on metacomprehension accuracy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31(6), 1267-1280.


Teach Students How to Learn: A review of Saundra McGuire’s strategy-packed book

by Jessica Santangelo, Ph.D. Hofstra University

For those interested in helping students develop strong metacognitive skills, Dr. Saundra McGuire’s book, Teach Students How to Learn: Strategies You Can Incorporate Into Any Course to Improve Student Metacognition, Study Skills, and Motivation, is concise, practical, and much less overwhelming than trying to figure out what to do on your own. It is both a consolidation of the research surrounding metacognition, mindset, and motivation and a how-to guide for putting that research into practice.

I have been interested in metacognition for several years. Having waded through the literature on teaching metacognition (e.g., using tutors, student self-checks, writing assignments, reflective writing, learning records, “wrappers”, or any number of other strategies), I found Dr. McGuire’s book to be an excellent resource. It places many of the strategies I already use in my courses in a larger context, which helps me better articulate to my students and colleagues why I am teaching those strategies. I also picked up a few strategies I had not used previously.

While metacognition is the focus of the book, Dr. McGuire includes strategies for promoting a growth mindset (Chapter 4) and for boosting student motivation (Chapters 7, 8 and 9). I hadn’t expected such an explicit focus on these two topics, but the book makes clear why they are important: they increase the probability of success. If students (and faculty) have a growth mindset, believing that success is due to behaviors and actions rather than innate talent or being “smart”, they are more likely to embrace the metacognitive strategies outlined in the book. The same principle applies to a person’s emotional state. Both emotions and learning arise in the brain and affect each other. If students and faculty are motivated to learn, they are more likely to embrace the metacognitive strategies.

The part of the book that is perhaps most practically useful is Chapter 11: Teaching Learning Strategies to Groups. Dr. McGuire details an approach she has honed over many years to teach metacognitive skills to groups of students in one 50-minute presentation (a detailed discussion of the metacognitive skills and the evidence for them is provided in Chapters 3-5). Slides that can be tailored for any course are available at the book’s accompanying website, along with a video of Dr. McGuire giving the presentation, throughout which she sprinkles in data and anecdotes that foster a growth mindset and increase student motivation.

Before reading Dr. McGuire’s book, I had had success using several strategies to promote student metacognition. I had a student go from failing exams to making high C’s, and other students move from C’s to B’s and A’s. However, I felt like my approach was haphazard since I had pulled ideas from different places in the literature without a cohesive framework for implementation. The book provided the framework I was missing.

This semester, I decided to use Dr. McGuire’s cohesive 50-minute session to see its impact on my students. I adapted it to be an online workshop because 1) I have limited class time this semester, and 2) an online intervention may benefit my colleagues who are interested in this approach but who aren’t able to use a class period for this purpose. In addition to the online workshop, I re-emphasize key points from the book when students come to office hours. I use phrasing and examples presented in the book to reinforce a growth mindset and boost motivation. I intentionally discuss “metacognitive learning strategies” rather than “study skills” because, as Dr. McGuire points out, many students think they have all the “study skills” they need but are often intrigued by how “metacognitive learning strategies” (which most have not heard of before) could help them.

You can jump in with both feet, as I did, or start with one or two strategies and build from there. Either way, this book allows you to take advantage of Dr. McGuire’s extensive experience as Director Emerita of the Center for Academic Success at LSU. I anticipate my copy will become dog-eared with use as I continue to be metacognitive about my teaching and the strategies that work best for me, my students, and my colleagues. Stay tuned for an update on my online adaptation of Dr. McGuire’s session once the semester wraps up!


When is Metacognitive Self-Assessment Skill “Good Enough”?

Ed Nuhfer, Retired Professor of Geology and Director of Faculty Development and Director of Educational Assessment, enuhfer@earthlink.net, 208-241-5029 (with Steve Fleisher, CSU Channel Islands; Christopher Cogan, Independent Consultant; Karl Wirth, Macalester College; and Eric Gaze, Bowdoin College)

In Nuhfer, Cogan, Fleisher, Gaze, and Wirth (2016), we noted the statement by Zell and Krizan (2014, p. 111) that “…it remains unclear whether people generally perceive their skills accurately or inaccurately.” In that paper, we showed why innumeracy is a major barrier to the understanding of metacognitive self-assessment.

Another barrier to progress exists because scholars who separately attempt quantitative measures of self-assessment have no common ground from which to communicate and compare results. This occurs because there is no consensus on what constitutes “good enough” versus “woefully inadequate” metacognitive self-assessment skill. Does overestimating one’s competence by 5% allow labeling a person as “overconfident”? We do not believe so. We think that a reasonable range must be exceeded before such labels should apply.

The five of us are now working on a sequel to our Numeracy paper cited above. In the sequel, we interpret the data from 1154 paired measures from a behavioral science perspective, extending the first paper’s description of the data through graphs and numerical analyses. Because we had a database of over a thousand participants, we used it to propose the first classification scheme for metacognitive self-assessment. It defines categories based on the magnitudes of self-assessment inaccuracy (Figure 1).

Figure 1. Draft of a proposed classification scheme for metacognitive self-assessment, based on the magnitude of inaccuracy of self-assessed competence: the difference, in percentage points (ppts), between ratings of self-assessed competence and scores from tests of actual competence, both expressed as percentages.

If you wonder where the “good” definition comes from in Figure 1, we disclosed on page 19 of our Numeracy paper: “We designated self-assessment accuracies within ±10% of zero as good self-assessments. We derived this designation from 69 professors self-assessing their competence, and 74% of them achieving accuracy within ±10%.”

The other breaks that designate “Adequate,” “Marginal,” “Inadequate,” and “Egregious” admittedly derive from natural breaks in the measures, expressed in percentages. Distributing our 1154 participants through these categories, we found that over two-thirds had adequate self-assessment skills, a bit over 21% exhibited inadequate skills, and the remainder fell within the “Marginal” category. Less than 3% qualified by our definition as “unskilled and unaware of it.”
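To make the scheme concrete, here is a minimal Python sketch of the kind of classifier Figure 1 implies. Only the “good” band (within ±10 ppts of zero) is stated explicitly above; the other breakpoints below are illustrative placeholders, not the actual Figure 1 values:

```python
# Classify self-assessment accuracy from paired measures, both in percent.
# Only the +/-10 ppt "good" band is sourced from the text; the remaining
# cut points are hypothetical stand-ins for the Figure 1 breaks.

def classify_self_assessment(self_rating_pct, test_score_pct):
    error = self_rating_pct - test_score_pct  # signed miscalibration in ppts
    magnitude = abs(error)
    if magnitude <= 10:
        return "good"
    if magnitude <= 20:   # hypothetical break
        return "adequate"
    if magnitude <= 30:   # hypothetical break
        return "marginal"
    if magnitude <= 50:   # hypothetical break
        return "inadequate"
    return "egregious"

print(classify_self_assessment(85, 80))  # good
print(classify_self_assessment(90, 30))  # egregious ("unskilled and unaware")
```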

These results indicate that the popular perspectives found in web searches that portray people in general as having grossly overinflated views of their own competence may be incomplete and perhaps even erroneous. Other researchers are now discovering that the correlations between paired measures of self-assessed competence and actual competence are positive and significant. However, to establish the relationship between self-assessed competency and actual competency appears to require more care in taking the paired measures than many of us researchers earlier suspected.

Do the categories as defined in Figure 1 appear reasonable to other bloggers, or do these conflict with your observations? For instance, where would you place the boundary between “Adequate” and “Inadequate” self-assessment? How would you quantitatively define a person who is “unskilled and unaware of it?” How much should a person overestimate/underestimate before receiving the label of “overconfident” or “underconfident?”

If you have measurements and data, please compare your results with ours before you answer. Data or not, be sure to become familiar with the mathematical artifacts summarized in our January Numeracy paper (linked above) that were mistakenly taken for self-assessment measures in earlier peer-reviewed self-assessment literature.

Our fellow bloggers constitute some of the nation’s foremost thinkers on metacognition, and we value their feedback on how Figure 1 accords with their experiences as we work toward finalizing our sequel paper.


Metacognition for Scholars: How to Engage in Deep Work

By Charity S. Peak, Ph.D. (Independent Consultant)

True confession: I’m addicted to shallow work. I wouldn’t say I’m a procrastinator as much as I am someone who prefers checking small things off my list or clearing my inbox over engaging in more complex tasks. I know I should be writing and researching. It’s just as much of my job as teaching or administrative duties, but I get to the end of my day and wonder why I didn’t have time for the most critical component of my promotion package – scholarship.

It turns out I’m not the only one suffering from this condition (far from it), and luckily there is a treatment plan available. It begins with metacognition about how one is spending time during the day, self-monitoring conditions that are most distracting or fruitful for productivity, and self-regulating behaviors in order to ritualize more constructive habits. Several authors offer suggestions for how to be more prolific (Goodson, 2013; Silvia, 2007), especially those providing writing prompts and 15-minute exercises, but few get to the core of the metacognitive process like Cal Newport’s (2016) recent Deep Work: Rules for Focused Success in a Distracted World. Newport, a professor of computer science at Georgetown and author of 5 books and a blog on college success, shares his strategies for becoming a prolific writer while balancing other faculty duties.

Newport claims that deep work is the ability to focus without distraction on a cognitively demanding task. It is arguably the most difficult and crucial capability of the 21st century. Creative thinking is becoming progressively rare in our distracted world, so those who can rise above shallow work are guaranteed to demonstrate value to their employers, especially colleges and universities. In order to be creative and produce new ideas, scholars must engage in deep work regularly and for significant periods of time. Instead, Newport argues, most people spend their days multitasking through a mire of shallow work like email, which is not cognitively demanding and offers little benefit to academia, let alone to an individual’s promotion. In fact, he cites that “a 2012 McKinsey study found that the average knowledge worker now spends more than 60 percent of the workweek engaged in electronic communication and Internet searching, with close to 30 percent of a worker’s time dedicated to reading and answering e-mail alone” (Newport, 2016, p. 5). Sound like someone you know?

The good news is that if you carve out space for deep work, your professional career will soar. The first step is to become metacognitive about how you are spending your time during the day. One simple method is to self-monitor how you use your work days by keeping a grid near your computer or desk. At the end of every hour throughout your day, record how much time you actually spent doing your job duties of teaching (including prep and grading), writing and research, and service. Like a food diary or exercise journal, your shallow work addiction will become apparent quickly, but you will also gain metacognition about when and under which conditions you might attempt to fit in time for deep work.
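If it helps to see the bookkeeping, here is a toy Python sketch of such a grid; the categories and hours are hypothetical:

```python
# Tally an hourly log of one workday and surface the deep/shallow split.
from collections import Counter

hourly_log = [
    "email", "teaching", "email", "writing",
    "meetings", "writing", "grading", "email",
]
tally = Counter(hourly_log)
deep_hours = tally["writing"]   # deep work: research and writing
total_hours = sum(tally.values())

print(dict(tally))
print(f"deep work: {deep_hours}/{total_hours} hours "
      f"({deep_hours / total_hours:.0%})")  # the diary makes the split visible
```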

Once you have a grasp of the issue at hand, you can begin to self-regulate your behavior by blocking off time in your schedule in which you can engage in a deeper level of creative thinking. Each person will gravitate toward a different modality conducive to his or her working style or arrangements. The author offers a few choices for you to consider, which have proven successful for other scholars and business leaders:

  • Monastic: Eliminate or radically minimize shallow obligations, such as meetings and emails, in an effort to focus solely on doing one thing exceptionally well. Put an out-of-office response on your email, work somewhere other than your workplace, or take a year-long sabbatical in order to completely separate from frivolous daily tasks that keep you away from research and writing. Most teaching faculty and academic leaders are unable to be purely monastic due to other duties.
  • Bimodal: Divide your time, dedicating some clearly defined stretches to deep pursuits and leaving the rest open to everything else. During the deep time, act monastically – seek intense and uninterrupted concentration – but schedule other time in your day for shallow work to be completed. One successful scholar shared the possibility of teaching a very full load one semester but not teaching at all during the next as an example of engaging deeply in both critical duties.
  • Rhythmic: Also called the “chain method” or “snack writing,” create a regular habit of engaging in deep work, such as every morning before going into work or at the end of each day. Blocking off one’s calendar and writing every day has been proven to be one of the most productive habits for scholars attempting to balance their research with other duties (Gardiner & Kearns, 2011).
  • Journalistic: Fit deep work into your schedule wherever you can – 15 minutes here, an hour there. Over time you will become trained to shift into writing mode on a moment’s notice. This approach is usually most effective for experienced scholars who can switch easily between shallow and deep work. Inexperienced writers may find that the multitasking yields unproductive results, so they should proceed cautiously with this method.

The key is to do something! You must ritualize whichever method you choose in order to optimize your productivity. This may take some trial and error, but with your new-found metacognition about how you work best and some alternative strategies to try, you will be more likely to self-regulate your behaviors in order to be successful in your scholarly pursuits. If you try new approaches and are still not engaging in enough deep work, consider joining a writing group or finding a colleague to hold you accountable on a regular basis. Again, like diet and exercise, others can sometimes provide the motivation and deadlines that we are unable to provide for ourselves. Over time, your addiction to shallow work will subside and your productivity will soar… or so they tell me.

Resources:

Gardiner, M., & Kearns, H. (2011). Turbocharge your writing today. Nature, 475, 129-130. doi:10.1038/nj7354-129a

Goodson, P. (2013). Becoming an academic writer: 50 exercises for paced, productive, and powerful writing. Los Angeles: Sage.

Newport, C. (2016). Deep work: Rules for focused success in a distracted world. New York: Grand Central Publishing.

Silvia, P. J. (2007). How to write a lot: A practical guide to productive academic writing. Washington, D.C.: American Psychological Association.


The Importance of Teaching Effective Self-Assessment

by Stephen Chew, Ph.D., Samford University,  slchew@samford.edu

Say we have two students who are in the same classes. For sentimental reasons, I’ll call them Goofus and Gallant[i]. Consider how they each react in the following scenarios.

In General Psychology, their teacher always gives a “clicker question” after each section. The students click in their response and the results are projected for the class to see. The teacher then explains the correct answer. Gallant uses the opportunity to check his understanding of the concept and notes the kind of question the teacher likes to use for quizzes. Goofus thinks clicker questions are a waste of time because they don’t count for anything.

In their math class, the teacher always posts a practice exam about a week before every exam. A day or two before the exam, the teacher posts just the answers, without showing how the problems were solved. Gallant checks his answers and if he gets them wrong, he finds out how to solve those problems from the book, the teacher, or classmates. Goofus checks the answers without first trying to work the problem. He tries to figure out how to work backwards from the answer. He considers that good studying. He memorizes the exact problems on the practice exam and is upset if the problems on the exam don’t match them.

In history class, the teacher returns an essay exam along with the grading rubric. Both boys were marked off for answers the teacher did not find sufficiently detailed and comprehensive. Gallant compares his answer to answers from classmates who scored well on the exam to figure out what he did wrong and how to do better next time. Goofus looks at the exam and decides the teacher gives higher scores to students who write more and use bigger words. For the next exam, he doesn’t change how he studies, but he gives long, repetitive answers and uses fancy words even though he isn’t exactly sure what they mean.

In each case, the teacher offers opportunities for improving metacognitive awareness, but the reactions of the two boys are markedly different. Gallant recognizes the opportunity and takes advantage of it, while Goofus fails to see the usefulness of these opportunities and, when given feedback about his performance, fails to take advantage of it. Just because teachers offer opportunities for improving metacognition does not mean that students recognize the importance of the activities or know how to take advantage of them. What is missing is an understanding of self-assessment, which is fundamental to developing effective metacognition.

For educational purposes, self-assessment occurs when students engage in an activity in order to gain insight into their level of understanding. The activity can be initiated either by the student or the teacher. Furthermore, to qualify as self-assessment, the student must understand and utilize the feedback from the activity. In summary, self-assessment involves students learning the importance and utility of self-assessments, teachers or students creating opportunities for self-assessment, and students learning how to use the results to improve their learning (Kostons, van Gog, & Paas, 2012).

Self-assessment is similar to formative assessment, which refers to any low-stakes activity designed to reveal student learning, but there are key differences (Angelo & Cross, 1993). First, students may undergo a formative assessment without understanding that it is an important learning opportunity. In self-assessment, the student understands and values the activity as an aid to learning. Second, students may not appreciate or use feedback from the formative assessment to improve their learning (Karpicke, Butler, & Roediger, 2009). Successful self-assessment involves using the feedback to identify misconceptions and knowledge gaps, and to hone learning strategies (Kostons et al., 2012). Third, even high-stakes, summative assessments can be used for self-assessment. For example, students can use the results of an exam to evaluate how successful their learning strategies were and make modifications in preparation for the next exam. Fourth, formative assessments are usually administered by the teacher, whereas self-assessment can be initiated by either teachers or students. For example, students may take advantage of chapter review quizzes to test their understanding. If students do not understand the importance of self-assessment and how to do it effectively, they will not take advantage of formative assessment opportunities, and they will fail to use feedback to improve their learning.

The importance of learning effective self-assessment is grounded in a sound empirical and theoretical foundation. Teaching students to conduct self-assessment will help them to become aware of and correct faulty metacognition, which in turn should contribute to more successful self-regulated learning (see Pintrich, 2004). Self-assessment also involves student recall and application of information, facilitating learning through the testing effect (see Roediger & Karpicke, 2006, for a review). The proper use of feedback has also been shown to improve student learning (Hattie & Yates, 2014). Finally, self-assessment activities can also provide feedback to teachers on the student level of understanding so that they can adjust their pedagogy accordingly.

Teachers play a critical role both in designing rich activities for self-assessment and in teaching students how to recognize valuable opportunities for self-assessment and take advantage of them. Some activities are more conducive to self-assessment than others. In the psychology class example above, Goofus doesn’t understand the purpose of the clicker question or the importance of the feedback. The teacher could have used a richer activity with the clicker questions to promote self-assessment (e.g., Crouch & Mazur, 2001). In the math class scenario, the teacher gives a practice exam but only gives the correct answers as feedback. Richer feedback would model the reasoning needed to solve the problems (Hattie & Yates, 2014) and support self-assessment. And even when feedback is given, students need to learn how to use it effectively and avoid misconceptions, as in the history class example where Goofus wrongly concludes the teacher wants longer answers with fancy words.

I believe effective self-assessment is a critical link between assessment activities and improved metacognition. It is a link that we teachers often fail to acknowledge. I suspect that effective teachers teach students how to carry out self-assessment on their understanding of course content. Less effective teachers may provide self-assessment opportunities, but they are either not effectively designed, or students may not recognize the importance of these opportunities or know how to take advantage of them.

There is not a lot of research on how to teach effective self-assessment. The existing research tends to focus mainly on providing self-assessment opportunities and not on how to get students to make use of them. I believe research on self-assessment would be highly valuable for teachers. Some of the key research questions are:

  • How can students be convinced of the importance of self-assessment?
  • Can self-assessment improve metacognition and self-regulation?
  • Can self-assessment improve student study strategies?
  • Can self-assessment improve long-term learning?
  • What are the best ways to design and implement self-assessments?
  • When and how often should opportunities for self-assessment be given?
  • What kind of feedback is most effective for different learning goals?
  • How can students be taught to use the feedback from self-assessments effectively?

Two fundamental learning challenges for college students, especially first-year students, are poor metacognitive awareness and poor study strategies (Kornell & Bjork, 2007; McCabe, 2011). The two problems are connected because using a poor study strategy increases false confidence without increasing learning (Bjork, Dunlosky, & Kornell, 2013). Improving both metacognitive awareness and study strategies of students is difficult to do (Susser & McCabe, 2013). I believe a promising but little studied intervention is to teach students the importance and the means of conducting effective self-assessment.

References

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco: Jossey-Bass.

Bjork, R. A., Dunlosky, J., & Kornell, N. (2013). Self-regulated learning: Beliefs, techniques, and illusions. Annual Review of Psychology, 64, 417-444.

Crouch, C. H., & Mazur, E. (2001). Peer Instruction: Ten years of experience and results. American Journal of Physics, 69, 970-977.

Hattie, J. A. C., & Yates, G. C. R. (2014). Using feedback to promote learning. In V. Benassi, C. E. Overson, & C. M. Hakala (Eds.). Applying the science of learning in education: Infusing psychological science into the curriculum. Retrieved from the Society for the Teaching of Psychology web site: http://teachpsych.org/ebooks/asle2014/index.php.

Karpicke, J. D., Butler, A. C., & Roediger, H. L. III. (2009). Metacognitive strategies in student learning: Do students practise retrieval when they study on their own? Memory, 17, 471-479.

Kornell, N., & Bjork, R. A. (2007). The promise and perils of self-regulated study. Psychonomic Bulletin & Review, 14, 219-224.

Kostons, D., van Gog, T., & Paas, F. (2012). Training self-assessment and task-selection skills: A cognitive approach to improving self-regulated learning. Learning and Instruction, 22, 121-132.

McCabe, J. (2011). Metacognitive awareness of learning strategies in undergraduates. Memory & Cognition, 39, 462-476.

Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16, 385-407.

Roediger, H. L., III., & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1, 181-210.

Susser, J. A., & McCabe, J. (2013). From the lab to the dorm room: Metacognitive awareness and use of spaced study. Instructional Science, 41, 345-363.

[i] Goofus and Gallant are trademarked names by Highlights for Children, Inc. No trademark infringement is intended. I use the names under educational fair use. As far as I know, Goofus and Gallant have never demonstrated good and poor metacognition.


Are Academic Procrastinators Metacognitively Deprived?

By Aaron S. Richmond
Metropolitan State University of Denver

Academic Procrastinators Brief Overview

One of my favorite articles is “Academic Procrastination of Undergraduates: Low Self-Efficacy to Self-Regulate Predicts Higher Levels of Procrastination” by Robert M. Klassen, Lindsey L. Krawchuk, and Sukaina Rajani (2008). Klassen and colleagues state that “…the rate for problematic academic procrastination among undergraduates is estimated to be at least 70-95% (Ellis & Knaus, 1977; Steel, 2007), with estimates of chronic or severe procrastination among undergraduates between 20% and 30%” (p. 916). Academic procrastination is “the intentional delay of an intended course of action, in spite of an awareness of negative outcomes” (Steel, 2007; as cited in Klassen et al., 2008, p. 916). Given these statistics, it is obvious that academic procrastination is an issue in higher education, and that understanding what factors influence it and relate to its frequency is of utmost importance.

In their article, Klassen and colleagues (2008) conducted two studies to understand the relationships among academic procrastination, self-efficacy, self-regulation, and self-esteem, and then to examine those relationships within “negative procrastinators” (p. 915). In Study 1, they surveyed 261 undergraduate students. They found that academic procrastination was inversely correlated with college/university GPA, self-regulation, academic self-efficacy, and self-esteem: as students’ frequency of academic procrastination went down, their GPA and self-reported scores of self-efficacy, self-esteem, and self-regulation went up. They also found that self-regulation, self-esteem, and self-efficacy predicted academic procrastination.

In Study 2, Klassen and colleagues (2008) were interested in whether there was a difference between negative and neutral procrastinators, that is, between cases where procrastinating caused a negative outcome (e.g., a grade penalty for assignment tardiness) and cases where it caused a neutral outcome (e.g., no penalty for assignment tardiness). They surveyed 194 undergraduates and asked students to rate how academic procrastination affected, either positively or negatively, specific academic tasks (reading, research, etc.). They then divided the sample into a group of students who self-reported that academic procrastination negatively affected them in some way and a group who reported positive or neutral effects. They found significant differences in GPA, daily procrastination, task procrastination, predicted class grade, actual class grade, and self-reported self-regulation between negative and neutral procrastinators. They also found that students most often procrastinated on writing tasks.

So Where Does Metacognition Come in to Play?

Because self-regulation was a main factor of their focus, I think Klassen and colleagues’ study gives us great insight into the potential role (either causal or predictive) that metacognition plays in academic procrastination. First, in Study 1, they used the Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich, Smith, Garcia, & McKeachie, 1993) to measure self-efficacy for self-regulation. This MSLQ subscale assesses students’ awareness of knowledge and control of cognition (Klassen et al., 2008). It asks questions like “If course materials are difficult to understand, I change the way I read the material” and “I try to change the way I study in order to fit the course requirements and instructor’s teaching style” (p. 920). As self-efficacy for self-regulation is a subset of metacognition, it is clear to me that these questions, indirectly if not directly, at least partially measure elements of metacognition.

This makes me wonder whether the results of Klassen et al.’s study would hold true for other forms of metacognition, such as metacognitive awareness. For example, how does academic procrastination relate to the metacognitive awareness factors that Schraw and Dennison (1994) suggest, such as knowledge of cognition (e.g., declarative knowledge, procedural knowledge, conditional knowledge) versus regulation of cognition (e.g., planning, information management, monitoring, evaluation)? Or, as Klassen et al. did not use the entire battery of measures in the MSLQ, how does academic procrastination relate to other MSLQ scales such as Learning Strategies, Help Seeking, and Metacognitive Self-Regulation (Pintrich et al., 1993)? Or how might Klassen’s results relate to behavioral measures of metacognition, such as calibration, or to the Need for Cognition (Cacioppo & Petty, 1982)? These questions suggest that metacognition could play a very prominent role in academic procrastination.

There Are Always More Questions Than Answers

To my knowledge, researchers have yet to replicate Klassen et al.’s (2008) study with an eye toward investigating whether metacognitive variables predict and mediate rates of academic procrastination. Therefore, I feel I must wrap up this blog (as I always do) with a few questions/challenges/inspirational ideas:

  1. What is the relationship among metacognitive awareness and academic procrastination?
  2. If there is a relationship between metacognition and academic procrastination, are there mediating and moderating variables that contribute to the relationship between metacognition and academic procrastination? For example, critical thinking? Intelligence? Past academic performance? The type of content and experience with this content (e.g., science knowledge)?
  3. Are there specific elements of metacognition (e.g., self-efficacy vs. metacognitive awareness vs. calibration vs. monitoring) that predict the frequency of academic procrastination?
  4. Can metacognitive awareness training reduce the frequency of academic procrastination?
  5. If so, what type of training best reduces academic procrastination?

References

Cacioppo, J. T., & Petty, R. E. (1982). The need for cognition. Journal of Personality and Social Psychology, 42(1), 116-131.

Ellis, A., & Knaus, W. J. (1977). Overcoming procrastination. NY: New American Library

Klassen, R. M., Krawchuk, L. L., & Rajani, S. (2008). Academic procrastination of undergraduates: Low self-efficacy to self-regulate predicts higher levels of procrastination. Contemporary Educational Psychology, 33, 915-931. doi:10.1016/j.cedpsych.2007.07.001

Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the motivated strategies for learning questionnaire (MSLQ). Educational and Psychological Measurement, 53, 801–813.

Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460-475.

Steel, P. (2007). The nature of procrastination: A meta-analytic and theoretical review of quintessential self-regulatory failure. Psychological Bulletin, 133, 65–94.


The Challenge of Deep Learning in the Age of LearnSmart Course Systems

by Lauren Scharff, Ph.D. (U. S. Air Force Academy)

One of my close friends and colleagues can reliably be counted on to point out that students are rational decision makers. There is only so much time in their days, and they have full schedules. If there are ways for students to spend less time per course and still “be successful,” they will find them. Unfortunately, their efficient choices may short-change their long-term, deep learning.

This tension between efficiency and deep learning was again brought to my attention when I learned about the “LearnSmart” (LS) text application that automatically comes with the e-text chosen by my department for the core course I’m teaching this semester. On the plus side, the publisher has incorporated learning science (metacognitive prompts and spacing of review material) into the design of LearnSmart. Less positively, some aspects of the LearnSmart design seem to lead many students to choose efficiency over deep learning.

In a nutshell, the current LS design prompts learning shortcuts in several ways. Pre-highlighted text discourages reading from non-highlighted material, and the fact that the LS quiz questions primarily come from highlighted material reinforces those selective reading tendencies. A less conspicuous learning trap results from the design of the LS quiz credit algorithm that incorporates the metacognitive prompts. The metacognition prompts not only take a bit of extra time to answer, but students only get credit for completing questions for which they indicate good understanding of the question material. If they indicate questionable understanding, even if they ultimately answer correctly, that question does not count toward the required number of pre-class reading check questions. [If you’d like more details about the LS quiz process design, please see the text at the bottom of this post.]

Last semester, the fact that many of our students were choosing efficiency over deep learning became apparent when the first exam was graded. Despite very high completion of the LS pre-class reading quizzes and lively class discussions, exam grades on average were more than a letter grade lower than previous semesters.

The bottom line is, just like teaching tools, learning tools are only effective if they are used in ways that align with objectives. As instructors, our objectives typically are student learning (hopefully deep learning in most cases). Students’ objectives might seem to be correlated with learning (e.g. grades) or not (e.g. what is the fastest way to complete this assignment?). If we instructors design our courses or choose activities that allow students to efficiently (quickly) complete them while also obtaining good grades, then we are inadvertently supporting short-cuts to real learning.

So, how do we tackle our efficiency-shortcut challenge as we go into this new semester? There is a tool that the publisher offers to help us track student responses by levels of self-reported understanding and correctness. We can see if any students are showing the majority of their responses in the “I know it” category. If many of those are also incorrect, it’s likely that they are prioritizing short-term efficiency over long-term learning and we can talk to them one-on-one about their choices. That’s helpful, but it’s reactionary.
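As a concrete illustration of that reactive check, here is a small Python sketch of one possible flagging rule; this is not the publisher’s actual tool, and the data and thresholds are hypothetical:

```python
# Flag students whose LS responses are mostly "I know it" yet often wrong,
# a pattern consistent with prioritizing completion credit over learning.

responses = {
    "student_a": [("I know it", False), ("I know it", False), ("I know it", True)],
    "student_b": [("Unsure", True), ("Think so", True), ("I know it", True)],
}

for student, answers in responses.items():
    claims = [ok for conf, ok in answers if conf == "I know it"]
    mostly_claiming = len(claims) / len(answers) > 0.5
    often_wrong = claims and claims.count(False) / len(claims) > 0.5
    if mostly_claiming and often_wrong:
        print(student, "-> worth a one-on-one conversation")
```

Here student_a gets flagged (three “I know it” claims, two incorrect); student_b does not.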

The real question is, How do we get students to consciously prioritize their long-term learning over short-term efficiency? For that, I suggest additional explicit discussion and another layer of metacognition. I plan to regularly check in with the students, have class discussions aimed at bringing their choices about their learning behaviors into their conscious awareness, and positively reinforcing their positive self-regulation of deep-learning behaviors.

I’ll let you know how it goes.

——————————————–

Here is some additional background on the e-text and the accompanying LearnSmart (LS) text.

There are two ways to access the text. One way is an electronic version of the printed text, including nice annotation capabilities for students who want to underline, highlight or take notes. It’s essentially an electronic version of a printed text. The second way to access the text is through the LS chapters. As mentioned above, when the students open these chapters, they will find that some of the text has already been highlighted for them!

As they read through the LS chapters, students are periodically prompted with some LS quiz questions (primarily from highlighted material). These questions are where some of the learning science comes in. Students are given a question about the material. But, rather than being given the multiple choice response options right away, they are first given a metacognitive prompt. They are asked how confident they are that they know the answer to the question without seeing the response options. They can choose “I know it,” “Think so,” “Unsure,” or “No idea.” Once they answer about their “awareness” of their understanding, then they are given the response options and they try to correctly answer the question.

This next point is key: it turns out that in order to get credit for question completion in LS, students must do BOTH of the following: 1) choose “I know it” when indicating understanding, and 2) answer the question correctly. If students indicate any other level of understanding, or if they answer incorrectly, LS will give them more questions on that topic, and the effort for that question won’t count towards completion of the required number of questions for the pre-class activity.

And there’s the rub. Efficient students quickly learn that they can complete the pre-class reading quiz activity much more quickly if they choose “I know it” in response to all the metacognitive understanding probes. If they guess at the subsequent question and answer correctly, it counts toward their completion of the activity and they move on. If they answer incorrectly, LS gives them another question from that topic, but they are no worse off with respect to time and effort than if they had indicated that they weren’t sure of the answer.
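A minimal sketch of the credit rule as I’ve described it (my reconstruction in Python, not the publisher’s code) shows why the shortcut dominates:

```python
# LS completion credit, as described above: an item counts only when the
# student both claims "I know it" AND answers correctly.

def counts_toward_completion(confidence: str, correct: bool) -> bool:
    return confidence == "I know it" and correct

# An honest "Unsure" never earns credit, even for a correct answer, so
# always claiming "I know it" costs nothing and sometimes pays off.
for confidence in ("I know it", "Think so", "Unsure", "No idea"):
    for correct in (True, False):
        print(f"{confidence:9s} correct={correct}: "
              f"{counts_toward_completion(confidence, correct)}")
```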

If students actually take the time to take advantage of, rather than shortcut, the LS quiz features (there are additional ones I haven’t mentioned here), their deep learning should be enhanced. However, unless they come to value deep learning over efficiency and short-term grades (e.g., quiz completion), there is no benefit to the technology. In fact, it might further undermine their learning through a false sense of understanding.


Metacognition in Academic and Business Environments

by Dr. John Draeger, SUNY Buffalo State

I gave two workshops on metacognition this week — one to a group of business professionals associated with the Organizational Development Network of Western New York and the other to faculty at Genesee Community College. Both workshops began with the uncontroversial claim that being effective (e.g., in business, learning, teaching) requires finding the right strategy for the task at hand. The conversation centered around how metacognition can help make that happen. For example, metacognition encourages us to be explicit and intentional about our planning, to monitor our progress, to make adjustments along the way, and to evaluate our performance afterwards. While not a “magic elixir,” metacognition can help us become more aware of where, when, why, and how we are or are not effective.


As I prepared for these two workshops, I decided to include one of my favorite metacognition resources as a key part of the workshop. Kimberly Tanner (2012) offers a series of questions that prompt metacognitive planning, monitoring, and evaluation. By way of illustration, I adapted Tanner’s questions to a familiar academic task, namely reading. Not all students are as metacognitive as we would like them to be. When asked to complete a reading assignment, for example, some students will interpret the task as turning a certain number of pages (e.g., read pages 8-19). They read the words, flip the page, and the task is complete when they reach the end. Savvy students realize that turning pages is not much of a reading strategy. They will reflect upon their professor’s instructions and course objectives. These reflections can help them intentionally adopt an appropriate reading strategy. In short, these savvy students are engaging in metacognition. They are guided by the Tanner-style questions in the table below.

Table: Using metacognition to read more effectively (Adapted from Tanner, 2012)

Reading
  • Planning: What do I already know about this topic? How much time do I need to complete the task? What strategies do I intend to use?
  • Monitoring: What questions are arising? Are my strategies working? What is most confusing? Am I struggling with motivation or content? What other strategies are available?
  • Evaluating: To what extent did I successfully complete the task? To what extent did I use the resources available to me? What confusions do I have that still need to be clarified? What worked well?

Big picture
  • Planning: Why is it important to learn this material? How does this reading align with course objectives?
  • Monitoring: To what extent has completing this reading helped me with other learning goals?
  • Evaluating: What have I learned in this course that I could use in the future?

After considering the table with reference to student reading, I asked the business group how the table might be adapted to a business context. They pointed out that middle managers are often flummoxed by company initiatives that either lack specificity or fail to align with the company’s mission and values. This is reminiscent of students who are paralyzed by what they take to be an ill-defined assignment (e.g., “write a reflection paper on what you just read”). Like the student scrambling to write the paper the night before, business organizations can be reactionary. Like the student who tends to do what they’ve done before in their other classes (e.g., put some quotations in a reflection paper to make it sound fancy), businesses are often carried along by organizational culture and past practice. When facing adversity, for example, organizational structure often suggests that doing something now (anything! just do it!) is preferable to doing nothing at all. Like savvy students, however, savvy managers recognize the importance of explicitly considering and intentionally adopting the response strategies most likely to further organizational goals. This requires metacognition, and adapting the Tanner-style table is a place to start.

When I discussed the Tanner-style table with the faculty at Genesee Community College, they offered a wide variety of suggestions concerning how the table might be adapted for use in their courses. For example, some suggested that my reading example presupposed that students actually complete their reading assignments. They offered suggestions concerning how metacognitive prompts could be incorporated early in the course to bring out the importance of the reading to mastery of course material. Others suggested that metacognitive questions could be used to supplement prepackaged online course materials. Another offered that he sometimes “translates” historical texts into more accessible English, but he is not always certain whether this is good for students. In response, someone pointed out that metacognitive prompts could help the faculty member more explicitly formulate the learning goals for the class and then consider whether the “translated” texts align with those goals.

In both business and academic contexts, I stressed that there is nothing “magical” about metacognition. It is not a quick fix or a cure-all. However, it does prompt us to ask difficult and often uncomfortable questions about our own efficacy. For example, participants in both workshops reported a tendency we all share: wanting to do things “our own way” even when that way is not the most effective. Metacognition puts us on the road towards better planning, better monitoring, better acting, and better alignment with our overall goals.

 


References

Tanner, K. D. (2012). Promoting student metacognition. CBE-Life Sciences Education, 11(2), 113-120.


Quantifying Metacognition — Some Numeracy behind Self-Assessment Measures

Ed Nuhfer, Retired Professor of Geology and Director of Faculty Development and Director of Educational Assessment, enuhfer@earthlink.net, 208-241-5029

Early this year, Lauren Scharff directed us to what might be one of the most influential reports on quantification of metacognition, Kruger and Dunning’s 1999 “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” In the 16 years that have since elapsed, a popular belief sprang from that paper and became known as the “Dunning-Kruger effect.” Wikipedia describes the effect as a cognitive bias in which relatively unskilled individuals suffer from illusory superiority, mistakenly assessing their ability to be much higher than it really is. Wikipedia thus describes a true metacognitive handicap: a lack of ability to self-assess. I consider Kruger and Dunning (1999) seminal because it represents what may be the first attempt to establish a way to quantify metacognitive self-assessment. Yet, as time passes, we always learn ways to improve on any good idea.

At first, quantifying the ability to self-assess seems simple. Comparing a direct measure of confidence taken through one instrument with a direct measure of demonstrated competence taken through another instrument should, it appears, do the job nicely. For people skillful in self-assessment, the scores on the self-assessment and performance measures should be about equal. Large differences can indicate underconfidence on the one hand or “illusory superiority” on the other.
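To make the arithmetic concrete, here is a minimal sketch of this difference-score logic in Python. The scores and the ±10-point tolerance are invented for illustration, and real instruments would need the noise handling discussed below.

```python
# Hypothetical illustration: compare self-assessed competence with
# demonstrated competence, both scored 0-100 for each person.
confidence  = [85, 40, 72, 55]   # scores from a self-assessment instrument
performance = [60, 65, 70, 54]   # scores from a competence instrument

for person, (conf, perf) in enumerate(zip(confidence, performance), start=1):
    gap = conf - perf            # positive = overconfident
    if gap > 10:                 # the 10-point tolerance is an arbitrary choice
        label = "overconfident ('illusory superiority')"
    elif gap < -10:
        label = "underconfident"
    else:
        label = "roughly well calibrated"
    print(f"person {person}: gap = {gap:+d} -> {label}")
```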

The Signal and the Noise

In practice, measuring self-assessment accuracy is not nearly so simple. The instruments of social science yield data consisting of two components: the signal, which expresses the relationship between our actual competency and our self-assessed feelings of competency, and significant noise generated by human error and inconsistency.

In analogy, consider the signal as your favorite music on a radio station, the measuring instrument as your radio receiver, the noise as the static that intrudes on your favorite music, and the data as the actual mix of noise and signal that you hear. The radio signal may truly exist, but unless we construct suitable instruments to detect it, we will not be able to generate convincing evidence that it even exists. Failures to do so can lead to the conclusion that metacognitive self-assessment is no better than random guessing.

Your personal metacognitive skill is analogous to an ability to tune to the clearest signal possible. In this case, you are “tuning in” to yourself—to your “internal radio station”—rather than tuning the instruments that measure this signal externally. In developing self-assessment skill, you are working to attune your personal feelings of competence to reflect the clearest and most accurate self-assessment of your actual competence. Feedback from the instruments has value because it helps us see how well we have achieved the ability to self-assess accurately.

Instruments and the Data They Yield

General, global questions such as: “How would you rate your ability in math?” “How well can you express your ideas in writing?” or “How well do you understand science?” may prove to be crude, blunt self-assessment instruments. Instead of single general questions, more granular instruments like knowledge surveys that elicit multiple measures of specific information seem needed.

Because the true signal is harder to detect than often supposed, researchers need a critical mass of data to confirm the signal. Pressures to publish in academia can cause researchers to rush to publish results from small databases obtainable in a brief time rather than spending the time, sometimes years, needed to generate the database of sufficient size that can provide reproducible results.

Understanding Graphical Depictions of Data

Some graphical conventions that have become almost standard in the self-assessment literature depict ordered patterns from random noise. These patterns invite researchers to interpret the order as produced by the self-assessment signal. Graphing nonsense data generated from random numbers in the same varied graphical formats can reveal what pure randomness looks like when depicted under any given convention. Knowing these patterns of randomness builds the numeracy needed to understand self-assessment measurements.
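One way to build that numeracy is to simulate nonsense data yourself. The sketch below is my own illustration (not the simulations from the Numeracy paper cited later): it generates purely random “self-assessed” and “actual” scores and then summarizes them in the sorted-quartile format common in this literature. Even pure noise yields the familiar picture of apparent overestimation at the bottom and underestimation at the top, because sorting noisy scores guarantees it.

```python
# Generate pure random noise for "self-assessed" and "actual" competency,
# then summarize it the way the self-assessment literature often does.
import numpy as np

rng = np.random.default_rng(42)
n = 100  # pretend each data point is one participant

self_assessed = rng.uniform(0, 100, n)  # random "confidence" scores
actual        = rng.uniform(0, 100, n)  # random "performance" scores

# The correlation of pure noise hovers near zero, as it should...
r = np.corrcoef(self_assessed, actual)[0, 1]
print(f"correlation of pure noise: r = {r:.2f}")

# ...but sorting participants into quartiles by actual score imposes order
# anyway: the bottom quartile appears to "overestimate" (mean self-assessment
# near 50 vs. low actual scores) and the top quartile to "underestimate".
order = np.argsort(actual)
for q, idx in enumerate(np.array_split(order, 4), start=1):
    print(f"quartile {q}: mean self-assessment = {self_assessed[idx].mean():5.1f}, "
          f"mean actual = {actual[idx].mean():5.1f}")
```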

Some obvious questions I am anticipating follow: (1) How do I know if my instruments are capturing mainly noise or signal? (2) How can I tell when a database (either my own or one described in a peer-reviewed publication) is of sufficient size to be reproducible? (3) What are some alternatives to single global questions? (4) What kinds of graphs portray random noise as a legitimate self-assessment signal? (5) When I see a graph in a publication, how can I tell if it is mainly noise or mainly signal? (6) What kinds of correlations are reasonable to expect between self-assessed competency and actual competency?

Are There Any Answers?

Getting answers to these meaty questions requires more than a short blog post, but some help is just a click or two away. This blog directs readers to “Random Number Simulations Reveal How Random Noise Affects the Measurements and Graphical Portrayals of Self-Assessed Competency” (Numeracy, January 2016), with acknowledgments to my co-authors Christopher Cogan, Steven Fleisher, Eric Gaze and Karl Wirth for their infinite patience with me on this project. Numeracy is an open-access journal, and you can download the paper for free. Readers will likely see the self-assessment literature in different ways after reading the article.


Metacognition in STEM courses: A Developmental Path

by Roman Taraban, Ph.D., Texas Tech University

There is a strong focus in science, technology, engineering, and math (STEM) courses on solving problems (Case & Marshall, 2004). Does problem solving in STEM involve metacognition? I argue that the answer must surely be ‘yes,’ because metacognition involves monitoring the effectiveness of learning and problem-solving strategies and using metacognitive knowledge to regulate behavior (Draeger, 2015). But when does metacognition become part of problem solving, and how does it come about? Can we discern development in metacognitive monitoring and regulation? In this post, I will present some qualitative data from a study on problem solving in order to reflect on these questions. The study I draw from was not about metacognition per se; however, it may provide some insights into the development of metacognition.

The study I conducted involved freshman engineering majors. These students were asked to solve typical problems from the mechanics course in which they were currently enrolled (Taraban, 2015). Not surprisingly, students varied in how they began each problem and how they proceeded towards a solution. To gain some insight into their problem-solving strategies, I asked students, after they had solved the problems, to state why they started with the equation they chose and not some other equation.

Students’ responses fell into at least three types, using labels from Case and Marshall (2004): surface, algorithmic, and deep conceptual. When asked why they started with their first equation, some students responded:

  • “I don’t know, it’s just my instinct.”
  • “No special reason. I’m just taking it randomly.”
  • “It’s just habit.”
  • “The first thing that came to my mind.”

Of interest here, these students did not appear to reflect on the specific problem or show evidence of modulating their behavior to it. Their responses fit a surface learning approach: “no relationships sought out or established, learn by repetition and memorization of formulae” (Case & Marshall, 2004, p. 609).

Other students’ responses reflected an algorithmic approach to learning — “identifying and memorizing calculation methods for solving problems” (Case & Marshall, 2004, p. 609):

  • “I am getting three variables in three unknowns so I can solve it.”

Here the student verbally expresses a more structured approach to the problem. The student believes that he needs three equations involving three unknowns and uses that as a goal. Students who take an algorithmic approach appear to be more reflective and strategic about their solutions to problems, compared to surface problem solvers.

Case and Marshall (2004) regarded both the surface and algorithmic pathways as part of development towards deeper understanding of domain concepts and principles, an endpoint they labeled the deep conceptual approach to learning: “relating of learning tasks to their underlying concepts or theory” with the intention “to gain understanding while doing this” (p. 609). Basically, their suggestion is that at some point students recognize that a goal of learning is to understand the material more deeply, and that this recognition guides how they learn. Case and Marshall’s description of deep conceptual learning fits Draeger’s (2015) earlier suggestion that monitoring the effectiveness of learning and regulating one’s behavior is characteristic of metacognitive thinking. Once students reach this level, we should be able to more readily observe students’ intentions to understand the material and observe their overt attempts to grasp it through explicit reflection and reasoning. Examples of this type of reflection from my study could be gleaned from those students who did not jump directly to writing equations without first thinking about the problem:

  • “If I choose the moment equation first, then directly I am getting the value of F. So in the other equations I can directly put the value of F.”

As students progress from surface to algorithmic to deep conceptual processing, there is certainly development. However, in the present examples that track that development, it is difficult to separate students’ thinking about the problem content from their thinking-about-thinking, that is, their metacognition. Draeger (2015) helps here by distinguishing between metacognition and critical thinking. The latter often requires domain-specific knowledge. Draeger suggests that “many students are able to solve complex problems, craft meaningful prose, and create beautiful works of art without understanding precisely how they did it” (p. 2). Basically, critical thinking is about methodology within a domain – e.g., the person knows how to format a narrative or select an appropriate statistical procedure – without necessarily reflecting on the effectiveness of those choices, that is, without metacognition. In the examples I provided above from my work with undergraduates on problem solving, there is invariably a mix of critical thinking and metacognition. Draeger’s distinction signals a need to better decouple these two distinct kinds of cognitive processes in order to clarify the developmental trajectory of metacognitive processing in problem solving.

Finally, why do we observe such wide variance in students’ approaches to problem-solving, and, relatedly, to metacognition? One reason is that instructors may emphasize assessment and grades (Case & Marshall, 2004). As a consequence, students may focus more on gaining points for the correct answer rather than on the process. Welsh (2015) has suggested that course structure can act as a barrier to deeper learning: “high stakes assessments may overshadow resources designed for metacognitive development” (p. 2). Welsh found that students were more concerned with test performance than with reflecting upon their study strategies and implementing learning strategies recommended by the instructor.

How are we to understand this discord between concern with test performance and metacognition? At some level, when students set goals to do well on tests they are regulating their behavior. Metacognitive resources from the instructor may be in competition with students’ perceived resources (e.g., access to old tests, study buddies, cramming the night before). The instructor can facilitate change, but the leap from surface and algorithmic learner to deep conceptual learner must be undertaken by the student.

Passion and commitment to a topic are strong motivators to find the means to access and acquire deeper conceptual understanding. One measure of teacher success is class test performance, but another can be found in student comments. Here is one that I recently received and found encouraging: “Despite the fact that I was a bit uninterested in the subject matter, this was one of my favorite classes. By the end of the semester, not only was I interested in the subject matter, I was fascinated by it.” Perhaps as instructors we need to facilitate good metacognitive practices but also nurture interest in what we teach, so that students are motivated to pursue it more deeply through those practices.

References

Case, J., & Marshall, D. (2004). Between deep and surface: procedural approaches to learning in engineering education contexts. Studies in Higher Education, 29(5), 605-615.

Draeger, J. (2015). Two forms of ‘thinking about thinking’: metacognition and critical thinking. Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking/

Taraban, R. (2015, November). Transition from means-ends to working-forward problem solving. 56th Annual Conference of the Psychonomic Society. Chicago, IL.

Welsh, A. (2015). Supports and barriers to students’ metacognitive development in a large intro chemistry course. Retrieved from https://www.improvewithmetacognition.com/supports-and-barriers-to-students-metacognitive-development-in-a-large-intro-chemistry-course/


Lean Forward, but Do It Metacognitively!

by Lauren Scharff, Ph.D. (U. S. Air Force Academy)

As the Director for the Scholarship of Teaching and Learning (SoTL) at my institution, a large part of my job description involves helping faculty intentionally explore new approaches and how they impact student learning. In other words – I work with forward-leaning faculty who are ready to try new things. So, I think a lot about how, when, and why faculty members adopt new pedagogies, tools, and activities, and about when, for whom, and in what contexts these new approaches enhance learning. This work dovetails nicely with the development and goals of metacognitive instruction.

As a reminder if you’re relatively new to our site, one of the premises we’ve previously shared here (e.g. Scharff, March 2015) and elsewhere (Scharff and Draeger, NTLF, 2015) is that Metacognitive Instruction involves the intentional and ongoing interaction between awareness and self-regulation, specifically with respect to the pedagogical choices instructors make as they design their lessons and then as they carry them out.

I was happy to see these connections reinforced last month at our 7th Annual SoTL Forum. Dr. Bridget Arend was invited to give a morning workshop and the keynote address. Along with James R. Davis, she is co-author of Facilitating Seven Ways of Learning: A Resource for More Purposeful, Effective and Enjoyable College Teaching. In her workshop Bridget dug into how to facilitate critical thinking, promote problem-solving, and support the building of skills (3 of the 7 ways of learning), while in her keynote she focused more strongly on the concept of matching student learning goals with the most effective teaching methods. She went beyond the usual discussion of tips and techniques to explore the underlying purpose, rationale, and best use of these [pedagogical] methods.

Dr. Bridget Arend giving the keynote address at the 7th Annual SoTL Forum at the U. S. Air Force Academy

Books such as these can help support metacognitive instruction.

While Bridget did not explicitly use the term “metacognitive instruction,” it struck me that her message of purposeful choice of methods directly supported key aspects of metacognitive instruction, especially those related to awareness of our pedagogical decisions. We (instructors) should not incorporate pedagogies (or new tools or activities) just because they are the ones typically used by our colleagues, or because they are what was “done to us as students and it worked for us,” or because they are the “new, latest-greatest thing” we’ve heard about. Rather, we should carefully review our learning goals and consider how each possible approach might support those goals for our students and our context.

We should also be mindful of other factors that might influence our adoption of new approaches. For example, administrators or institutions often reward faculty who are leading the adoption of new technologies. Sometimes the message seems to be “the more new technologies incorporated, the better” or “out with the old and in with the new,” so that a program or institution can market itself as the most cutting-edge in education. However, while many of us appreciate being rewarded or showcased for new efforts, we also need to pause to consider whether or not we’re really supporting student learning as well as we could with these practices.

Questions we should ask ourselves before implementation include: How will a new pedagogical approach or a new app really align with the learning goals we have for our students? Will all of our choices complement each other, or might they work at cross-purposes? Realistically, there are a limited number of learning outcomes we can successfully accomplish within a lesson or even across a semester.

As we implement these new approaches and tools, we should ask additional questions. How are they actually impacting aspects of student engagement, attitudes towards learning, and ultimately, the learning itself? How might they be adjusted (either “in the moment” or in future lessons) as we use them in order to better support our learning goals for our students in our context? No group of students is the same, and the context also shifts over time. What worked well in the past might need adjusting or more radically changing in the future.

In sum, we know that no single approach is going to work for all learning goals or all students across all situations. But if we build our awareness of possibilities using resources such as Facilitating Seven Ways of Learning (and many other published papers and texts) to help guide our pedagogical choices; if we carefully attend to how our approaches affect students and student learning; and if we modify our approach based on those observations (perhaps using systematic data if we’re conducting a SoTL research project), then we WILL be more likely to enhance student learning (and our own development as metacognitive instructors).

Thus, lean forward as instructors, but do it metacognitively!

————————-

Davis, James R. & Arend, B. (2013). Facilitating Seven Ways of Learning: A Resource for More Purposeful, Effective and Enjoyable College Teaching. Stylus Publishing, Sterling, VA.

Scharff, L. & Draeger, J. (September, 2015). Thinking about metacognitive instruction. The National Teaching and Learning Forum, 24(5), p. 4-6. http://onlinelibrary.wiley.com/doi/10.1002/ntlf.2015.24.issue-5/issuetoc


Teaching a new course requires metacognition

by John Draeger, SUNY Buffalo State

One of the joys of being an academic philosopher is the freedom to explore new ideas. For example, the recent retirement of a colleague left a gap in my department’s usual offerings. I agreed to take over a course on the philosophy of love and sex. While I have written scholarly articles on related topics, I confess that teaching this new material had me feeling the sort of constructive discomfort that I seek to foster in my students (Draeger 2014). As a result, I experienced a heightened sense of awareness concerning what I was doing and why. In particular, I came to believe that teaching a new course requires metacognition.

As I sat down to construct the course, I was guided by the thought that philosophy can help students learn to have careful conversations about ideas that matter. With respect to this new course, I wanted students to learn to ask tough questions. Can we really promise to love someone forever? Can sex ever be meaningless? Is becoming emotionally attached to someone other than your partner worse than sleeping around? Is it possible to love more than one person at the same time or does romantic love require some form of exclusivity? Such questions prompt students to consider whether commonly held beliefs are actually justified. If these views withstand scrutiny, then students have the conceptual resources to offer a proper defense. If not, then students can begin searching for ideas worth having. Such questions can also open up a larger conversation about related concepts (e.g., trust, intimacy, respect, jealousy, loyalty).  Because much of the course material was new to me, I had not always thought through the various permutations and implications of each philosophical position. I often found myself learning “on the fly” along with my students as I reflected on my own assumptions and preconceived ideas in “real time” while the discussion unfolded in front of me.

In an earlier post (Draeger 2015), I argued that “critical thinking involves an awareness of mode of thinking within a domain (e.g., question assumptions about gender, determine the appropriateness of a statistical method), while metacognition involves an awareness of the efficacy of particular strategies for completing that task.” As I reflect on my philosophy of love and sex course, I realize that my heightened awareness contained elements of both critical thinking and metacognition. Because the material was largely new to me, I was more aware of my own critical thinking processes as I engaged in them and more “tuned into” what my students were grappling with (e.g., assumptions about love and sex, related concepts, implications of the view we are considering). I also found myself metacognitively evaluating whether my students were critically engaged and whether my choices were moving the conversation in philosophically fruitful directions. I like to think that this sort of monitoring happens in all of my classes, but I was acutely aware of its importance given that the material was unfamiliar and my discussion prompts were untested. Moreover, I like to think that I never resort to autopilot and that I am always keenly aware of fluid learning environments. However, because the material was so fresh, I could not help but engage in self-regulation. I did not have a reliable stock of examples and responses at my fingertips. Even more than usual, I found myself making intentional changes to my approach based on “in-the-moment” feedback from students (Scharff 2015).

Teaching a new course always rejuvenates me because it reminds me how much I love to learn. As the teacher, however, I was responsible for more than my own learning. Effective teaching requires thinking about the critical thinking processes of all the learners in the room, including my own. It also requires monitoring a fluid learning environment and making intentional changes (often in-the-moment changes) if students are to have careful conversations about ideas that matter (e.g., love, sex). While teaching with metacognition is generally a good idea, this semester taught me that teaching a new course requires metacognition.


References

Draeger, John (2015). “Two forms of ‘thinking about thinking’: metacognition and critical thinking.” Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking

Draeger, John (2014). “Cultivating a habit of constructive discomfort.” Retrieved from https://www.improvewithmetacognition.com/cultivating-a-habit-of-constructive-discomfort

Scharff, Lauren (2015). “What do we mean by ‘metacognitive instruction’?” Retrieved from https://www.improvewithmetacognition.com/what-do-we-mean-by-metacognitive-instruction/


Forging connections with students through metacognitive assignments

by Diane K. Angell, St. Olaf College

We have all likely shared the experience, early in our teaching career, of a gaggle of students gathering at our office door after an exam. “I studied for so many hours!” “I came to class everyday.” “I always did well in high school.” Students also seemed to struggle ahead of exams as they tried to learn and master scientific material. “What should I study?” “Why can’t you just give us a study guide?” I was often perplexed by these frustrations. I wondered and tried to recall how I had learned material and strategized as a science student preparing for the inevitable exams in larger introductory college courses.

That same month, I found myself at a conference, the Associated Colleges of the Midwest’s Teagle Collegium on Student Learning. The focus was very much on metacognition. Although, as a biologist, I struggled to understand the details of several presentations, it all sounded very familiar. Perhaps this was what my students were missing? I appreciated the intellectual points and took copious notes, until my mind began to wander. I needed to prepare to host a large group for Thanksgiving in the coming days. How should I start? What did I need to purchase and where would I get it? What needed to be prepared and cooked when, so that all the different dishes were ready and warm when it was actually time to sit down and eat? I began to get anxious. I quickly realized two things. First, focusing back on my students, I immediately appreciated the degree to which preparing a Thanksgiving meal and preparing to take an exam are both complex metacognitive tasks. I could finally imagine what my students were feeling and understand the metacognitive challenges exams present to them. Students need to evaluate what they know, what they don’t know, and how best to approach any material they are uncertain of. And unlike cooking and meal preparation, there are no clear, simple sets of directions highlighting how to approach the task of taking a typical college classroom exam. Second, my own pre-Thanksgiving mental preparation check made me realize that I have likely been using such metacognitive skills since I was a student, but was just not aware I was using them. Perhaps I did have some wisdom to share, and upon returning to campus I committed to using a metacognitive approach to help students prepare for exams.

Introductory college biology courses are an excellent place to begin engaging students with a metacognitive approach to exam preparation. These classes will probably always have exams. Moreover, as students move on in biology they are likely to face even more challenging exams. In order to engage students in metacognitive practices I came up with a series of straightforward metacognitive prompts that I emailed to students before each exam. They included simple questions such as: How do you think you will start studying? What techniques will you use while studying? What was the most difficult topic in this section of the course and why was it difficult? How will you approach the material you do not yet understand?

I found their responses fascinating. Some clearly wrote as little as possible, but most wrote quite extensively sharing with me precise details of how they had studied (or not studied) to prepare for the exam. Many responses were surprisingly sincere and confessional. The assignments brought home to me two points that have left a lasting impression. First, I was reminded of the importance of establishing a connection with students as well as the importance of that connection to student learning. Their emailed responses helped me get to know them in a way that was very different than in the public arena of class or lab. They let me in on their personal narrative of test preparation. I sometimes felt as if I was reading a secret diary. They were honest with me in their emails about what their studying experiences had been, perhaps even more so than if they had come to see me in person. Perhaps the proliferation of email, texting and Facebook has made students more comfortable conversing through a keyboard with faculty than face to face. After responding to the emailed questions, many did eventually come in to chat and engage with me about study strategies and differences they were noticing between high school and college. They seemed to think they knew me better and that I knew them better. Upon arriving in my office, they would frequently refer back to their emailed responses, even though I sometimes struggled to remember exactly who had emailed me what details. The emails seemed to prompt a unique relationship and they saw me as someone who was interested in them as an individual, an attitude that likely helped them feel as if they were part of the learning community in the classroom.

I also came to understand that the task of mastering material in order to prepare for an exam has become more complicated. In the past, we had a textbook and we had notes from class. That was it. Today this task really is fraught with complex decisions. Students in college classrooms are less likely to be taking notes in a traditional lecture format. They are more likely to be engaged during class in small group discussions and problem-based learning activities. They have access to, and are justly encouraged to use, the online resources that come with texts, and to take advantage of other online resources. They are also frequently encouraged to form study groups to discuss their understanding of topics outside of class. These are great ways for students to engage with material and prepare for exams. This diverse learning landscape can be a lifesaver for some students, but for others, when it comes time to prepare for an exam, the variety of options for studying can be overwhelming and paralyzing. As we have opened up new ways of teaching and learning, we may have left students with many different resources at their fingertips but failed to help them think metacognitively about what works for them as they master knowledge to prepare for a summative exam.

Both the stronger connections I made with my students and my better understanding of the diverse exam preparation choices they must make help me feel better prepared to mentor and advise students as they navigate their introductory biology course. By engaging students metacognitively in emails concerning their exam preparation, I gained a deeper understanding of how students were learning in my class. Their sincere and thoughtful responses provided a window on their world and, in interesting ways, their metacognitive thoughts rounded out my efforts to metacognitively assess my course. As faculty, we are often reminded to step back and reflect on our goals for our class and for student learning. We need to consider what is working in our course and what is not. It was finally clear to me that a full metacognitive consideration of my course required regular reflective feedback from my students and an understanding of what they were struggling with. Although I had always solicited such feedback, students seemed much more likely to be thinking about their learning, and willing to share their assessment of that learning, in an email just before an exam. Ultimately, their honest metacognitive feedback has meant that I have gained as much as or more than the students I was initially trying to help.


Fighting Imposter Syndrome Through Metacognition

By Charity S. Peak, Ph.D.

Have you ever felt like an imposter at work? Taught a class outside your expertise? Felt intimidated before giving a presentation? Nearly every faculty member experiences this imposter phenomenon at some point. After all, as faculty we work around incredibly smart and talented people who shine as experts in their fields. Additionally, people drawn to academia naturally feel compelled to be knowledgeable and often feel inadequate when they are not (Huston, 2009).

Imposter syndrome is “an overwhelming sense of being a fraud, a phony, of not being good enough for [a] job, despite much evidence to the contrary” (Kaplan, 2009). Despite accomplishing significant professional milestones, people cannot seem to internally acknowledge their success or feel deserving. This sense of being an imposter is prevalent among women but is increasingly being revealed by men as well. Although the condition is often referred to as a syndrome, it is important to understand that it is NOT a diagnosable mental illness found in the DSM-5. Instead, it is an affliction, similar to test or performance anxiety, experienced by a variety of high-achieving individuals, and it can be addressed successfully using metacognition and self-regulation.

Reactions to imposter syndrome vary widely from individual to individual. Typically, the imposter phenomenon starts with a self-sabotaging internal dialogue, such as:

  • Who do I think I am? I’m not smart enough to teach this class or present on this topic.
  • What if my students ask me a question that I can’t answer?
  • What if someone finds out I don’t know what I’m talking about?
  • I’m not cut out for this. I really can’t do this.

A physical reaction similar to other stressful situations (fight, flight, or freeze) often follows:

  • Increased blood pressure
  • Blushing
  • Sweating
  • Shaking
  • Tonic immobility (i.e., mental block or “deer in headlights”)

Faculty in these situations tend to respond in one of two ways:

  • Undercompensating by becoming submissive, overly agreeable or even apologetic
  • Overcompensating with defensive, bossy and aggressive behaviors
Fortunately, because imposter syndrome is an affliction rather than an illness, it responds to metacognitive strategies such as the following:

  1. Recognize symptoms when they arise and recenter yourself through breathing:
  • Assume a comfortable posture
  • Close your eyes if possible
  • Focus on the sensations of your body
  • Breathe in through your nose and out through your mouth
  • When your mind wanders, gently bring it back to your breath
  • Breathe in, breathe out
  • Repeat for at least 10 breaths and up to 5 minutes
  2. Reconstruct a new, positive internal dialogue. Talk to yourself as you would a good friend by being supportive and confidence-building.
  3. Posture yourself as confident. It turns out that “fake it till you make it” works with regard to physical posture. People who use Power Poses for 2 minutes demonstrate higher levels of confidence-building hormones (testosterone) as opposed to stress-inducing hormones (cortisol) (Carney, Cuddy & Yap, 2010; Cuddy, 2012).
  4. Acknowledge the limits of your knowledge. Instead of hiding your lack of expertise, build a repertoire of ways to deflect difficult questions, such as:
  • What do you think?
  • I don’t know. Does anyone want to look it up and tell us the answer?
  • Great question. Can we talk about that more after class (or meeting)?
  • Let’s not dive too deeply into that issue because it might distract us from today’s agenda.
  • Good thought. Does anyone want to collaborate to address that concern?
  • Here is what I know, and here is what I don’t know (Huston, 2009).
  5. Avoid “teaching as telling.” Rather than lecturing, which requires great preparation and pressure to be the expert in the room, move toward new pedagogical models of facilitation which turn the teaching burden over to the students, such as jigsaw and gallery walk.
  6. Know that you are not alone. It is plausible that nearly everyone in the room has felt this way at one point or another in their careers, even though they may not readily share these thoughts with others. Normalizing the feelings to yourself will start to defuse your anxiety.
  7. Share the issue with others you trust. A mentor or even a small community of colleagues can collaboratively strategize about how to address the issue.
  8. Recognize external factors that might contribute. Often people blame themselves for toxic situations which were created by outside circumstances. If the situation persists, consider declining future involvement to avoid setting yourself up for difficulties.

“Awareness is half the battle” really does apply to imposter syndrome. Through metacognition, you can conquer the self-defeating thoughts and behaviors that might prevent you from succeeding in your personal and professional life. Intentional self-monitoring of negative internal dialogue followed by practicing self-regulation through the simple strategies outlined above is the antidote to imposter syndrome. So next time you feel yourself break into a sweat (figuratively or literally), assume a Power Pose and leverage metacognition to triumph over your doubts!

Resources:

Carney, D. R., Cuddy, A. J., & Yap, A. J. (2010). Power posing: Brief nonverbal displays affect neuroendocrine levels and risk tolerance. Psychological Science, 21(10), 1363-1368. doi: 10.1177/0956797610383437

Cuddy, A. (2012, October 1). Your body language shapes who you are [Video file]. Retrieved from https://www.ted.com/talks/amy_cuddy_your_body_language_shapes_who_you_are?language=en

Huston, T. (2009). Teaching What You Don’t Know. Cambridge, MA: Harvard University Press.

Kaplan, K. (2009). Unmasking the impostor. Nature, 459(21): 468-469. doi: 10.1038/nj7245-468a


A Minute a Day Keeps the Metacognitive Doctor Away!

Aaron S. Richmond

Metropolitan State University of Denver

First and foremost, what I am about to discuss with you is not an educational or metacognitive teaching panacea (a.k.a. a silver bullet). But what I would like to introduce and discuss is the idea of using Classroom Assessment Techniques (affectionately known as CATs) as a form of metacognitive instructional strategy.

CATs: A Very Brief Review

Described best by Angelo and Cross (1993), CATs are designed to serve two purposes. First, they are meant as a formative assessment tool for teachers to understand how much their students are learning in the course. Second, CATs are designed to provide you, the teacher, feedback on the efficacy of your teaching strategies/methods. CATs are typically very brief and take very little instructional time (a minute or two). CATs are also created based on your assessment needs. For instance, if you are interested in assessing course-related knowledge and skills, then you might want to use the one-minute paper, focused listening, or the background knowledge probe (see Cunningham & Moore, n.d.). Or, if you are interested in assessing skill in analysis and critical thinking, you may want to use pro and con grids, analytic memos, or content, form, and function outlines (see Cunningham & Moore, n.d.). If you would like to assess your students’ skill in synthesis and creative thinking, you may want to use the one-sentence summary, concept maps, or approximate analogies. The list of different types of CATs goes on and on (see Cunningham & Moore, n.d., for a complete list and summary), so I would like to focus on previously established CATs that lend themselves to being quite quick, easy, and potentially effective metacognitive improvement tools. I like to call these the Metacognitive CATs, or MCATs!

The MCATs

Cunningham and Moore (n.d.) recently categorized 50 of Angelo and Cross’ (1993) CATs based on the purpose of the assessment needed (some described previously in this blog). Among these categories, Cunningham and Moore posit that some CATs are meant as “Techniques for Assessing Learner Attitudes, Values, and Self-Awareness” (p. 4). Several of the CATs in this category lend themselves to use almost as metacognitive awareness activities. Specifically, these include course-related self-confidence surveys, focused autobiographical sketches, muddiest point, productivity study time log, and diagnostic learning logs. Let me take a moment to describe these potential MCATs (Angelo & Cross, 1993).

  • Course-related self-confidence surveys: At the end of class you have students fill out an anonymous questionnaire that assesses their confidence in mastering the material discussed in class.  
  • Focused autobiographical sketches: At the end or beginning of class, have students write a brief statement on a “successful” study strategy or learning strategy that they used to learn the class material.
  • Muddiest point: At the conclusion of a lesson, ask students to write down the one concept that they are still struggling with in one or two sentences. You can use this to identify which concepts students are struggling with.
  • Productivity study time log: Have students keep a daily log and record both the amount of time spent studying and the quality of time spent studying for your course. Students can complete this before, or at the beginning or end of class.
  • Diagnostic learning logs: Have students write a log for assignments or assessments in which the student identifies what study methods and knowledge that they had correct and have them diagnose what they did not have correct and how to solve this error for the future. These can be done before, during or after class as well.

Now, these MCATs are just CATs unless you help students connect the CAT to the metacognition. The trick is, how do you do this? One answer may be direct feedback and reflection for the learner. What I mean is that if you employ a CAT (e.g., muddiest point), then you need to make it metacognitive by providing feedback directly to your students on their performance, having students elaborate and reflect on their answers, and providing constructive solutions/assistance in improving their metacognition. Let me illustrate using the muddiest-point MCAT. After your students turn in their muddiest point, take a few minutes to talk to them about why they are confused about the content. You may ask about their note-taking strategies in class. You may ask about their reading strategies when they read the chapter before class. You may ask about their attention to the lesson (i.e., the amount of cell phone or computer use). You may ask about their use of other study strategies. Then, have each student reflect on why they didn’t understand the course material based on your conversation, and have them commit to changing just one thing about how they study. The next time you repeat the muddiest-point MCAT, the process can start over and you can revisit the same questions with your students. Incorporating direct feedback, reflection, and solutions into CATs may just turn them into MCATs.

Concluding Questions

To my knowledge, educational and metacognitive researchers have not yet investigated the efficacy of these potential MCATs as metacognitive instructional tools. Therefore, I feel I must wrap up this blog with a few questions/challenges/inspirational ideas:

  1. Can the use of MCATs increase metacognitive awareness in students?
  2. Can the use of MCATs increase metacognitive knowledge in students?
  3. Can the use of MCATs increase academic performance of students?
  4. If the answer to any of the previous questions is yes, then the question becomes: are some MCATs better than others, and can students transfer the use of these MCATs to other content domains?

References

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey-Bass.

Cunningham, K., & Moore, D. (n.d.). 50 CATS by Angelo and Cross. Retrieved from http://pages.uoregon.edu/tep/resou


Using Metacognition to Make International Connections

by Lauren Scharff, PhD, U. S. Air Force Academy and John Draeger, PhD, SUNY Buffalo State

If you’re one of our longer-term followers, you’ll notice that this post is a bit different from others on our site. We just wrapped up a fantastic week in Melbourne, Australia working with six colleagues from around the globe, and we want to share some of our metacognition endeavors and reflections with you. This experience was part of the second International Collaborative Writing Groups  (ICWG) that is an affiliate effort for the International Society for the Scholarship of Teaching and Learning (ISSoTL).

Eight groups were part of the ICWG. The groups formed in May and met virtually over the summer to focus their topics and develop an outline prior to the face-to-face meeting this past week. Our group’s topic was The Student Learning Process, and we focused our efforts on how metacognition would support the transfer of learning from one situation or context to another. We believe the transfer of learning is one of the ultimate goals of education because it supports lifelong learning and employability.

The group’s work on how metacognition supports the transfer of learning will be revealed when it’s published, but meanwhile, we will share some ways that metacognition was part of our experience of facilitating the group. We’ll start with some pictures to set the tone. The first shows our group working: from left to right, Lauren, Susan Smith (Leeds Beckett University, UK), Lucie S Dvorakova (Honors Student, University of Queensland, Australia), Marion Tower (University of Queensland), Dominic Verpoorten (IFRES-University of Liège, Belgium), Marie Devlin (Newcastle University, UK), and Jason M. Lodge (University of Melbourne, Australia), [John Draeger taking the pic]. The second gives you a sense of the overall setting, showing multiple groups all kept to task by savvy ICWG coordinators, Mick Healey (University of Gloucestershire, retired) and Kelly Matthews (University of Queensland). Fortunately, Mick and Kelly also built in some social time for community building. The third picture shows our group at the State Library of Victoria, left to right: Dominic, Sam, Marion, Sue, Marie, John, Lauren and Jason.


How Metacognition Found Its Way into Our Facilitating Experiences

If you read the home page of this site, you’ll notice that we loosely define metacognition as the intertwined awareness and self-regulation of a process/skill, specifically with the goal of developing that process or skill. Although the site is focused on metacognition as it relates to teaching and learning, it can refer to any skill or process. Facilitating a group can be much like teaching, but it involves some additional processes that might more traditionally be linked to leadership and communication.

We noticed ourselves using metacognition in the following aspects of our work:

Use of Language: Given the international character of the group, self-monitoring and self-regulation allowed us to navigate differences in language and underlying assumptions. For example, through our discussions we learned that academic faculty might be referred to as ‘staff,’ ‘tutor,’ ‘instructor’ or ‘professor.’ Individual courses might be referred to as ‘classes,’ ‘modules’ or ‘units’ of study.

Assumptions about education: Our discussion revealed differences in the structures of the university systems in different countries. When discussing how students might use their learning in one course to inform their learning in another, the two North Americans on the team (John and Lauren) tended to think about transfer learning between a diverse set of courses across a broad liberal arts core curriculum in addition to transfer across more closely related courses within a major. Because undergraduate education in Australia and the United Kingdom tend not to be structured around a broad core curriculum, members of the team from these countries tended to focus on transfer learning within a particular field of study.

As we drafted our text and created a survey that was to be used in four different countries, we each engaged in self-monitoring of the terms as the conversation was in progress and would regulate behavior accordingly. For example, someone would start by saying “I think that staff might…” but then quickly add “or perhaps you might say ‘professors.’” Similarly, we would use our newly developed awareness of the different educational structures to guide our discussion about how transfer of learning might be supported across all of our learning environments.

Management of Project Scope: Both transfer of learning and metacognition are vast areas of study. Given the wide variety of experiences and individual interests in our group, we explored a wide array of possible directions for our paper, some of which we decided to table for follow-on papers (e.g., how students’ level of intellectual development might impact transfer of learning, and the creation of a “toolkit” to help instructors support transfer of learning). Moving the conversation in fruitful directions required that all of us remain mindful of the task at hand (i.e., working towards a 6000-word article). Self-monitoring allowed us to detect when an interesting discussion had gone beyond the scope of our current article, and self-regulation brought us back to the task at hand more quickly.

In summary, the international character of the writing group added a depth and richness to the conversation, but it also increased the likelihood of misunderstanding and the challenge of group management. Self-monitoring and self-regulation allowed us to overcome those challenges.

Many thanks to our group members for a fantastic face-to-face experience, and we look forward to our continued exchanges as we finalize the paper and carry on with the follow-on papers.


Two forms of ‘thinking about thinking’: metacognition and critical thinking

by John Draeger (SUNY Buffalo State)

In previous posts, I have explored the conceptual nature of metacognition and shared my attempts to integrate metacognitive practices into my philosophy courses. I am also involved in a campuswide initiative that seeks to infuse critical thinking throughout undergraduate curricula. In my work on both metacognition and critical thinking, I often find myself using ‘thinking about thinking’ as a quick shorthand for both. And yet, I believe metacognition and critical thinking are distinct notions. This post will begin to sort out some differences.

My general view is that the phrase ‘thinking about thinking’ can be the opening move in a conversation about either metacognition or critical thinking. Lauren Scharff and I, for example, took this tack when we explored ways of unpacking what we mean by ‘metacognition’ (Scharff & Draeger, 2014). We considered forms of awareness, intentionality, and the importance of understanding of various processes. More specifically, metacognition encourages us to monitor the efficacy of our learning strategies (e.g., self-monitoring) and prompts us to use that understanding to guide our subsequent practice (e.g., self-regulation). It is a form of thinking about thinking. We need to think about how we think about our learning strategies and how to use our thinking about their efficacy to think through how we should proceed. In later posts, we have continued to refine a more robust conception of metacognition (e.g., Scharff 2015, Draeger 2015), but ‘thinking about thinking’ was a good place to start.

Likewise, the phrase ‘thinking about thinking’ can be the opening move in conversations about critical thinking. Given the wide range of program offerings on my campus, defining ‘critical thinking’ has been a challenge. Critical thinking is a collection of skills that can vary across academic settings and how these skills are utilized often requires disciplinary knowledge. For example, students capable of analyzing how factors such as gender, race, and sexuality influence governmental policy may have difficulty analyzing a theatrical performance or understanding the appropriateness of a statistical sampling method. Moreover, it isn’t obvious how the skills learned in one course will translate to the course down the hall. Consequently, students need to develop a variety of critical thinking skills in a variety of learning environments. As we began to consider how to infuse critical thinking across the curriculum, the phrase ‘thinking about thinking’ was something that most everyone on my campus could agree upon. It has been a place to start as we move on to discuss what critical thinking looks like in various domains of inquiry (e.g., what it means to think like an artist, biologist, chemist, dancer, engineer, historian, or psychologist).

‘Thinking about thinking’ captures the idea that students need to think about the kinds of thinking skills they are trying to master, and that teachers need to be explicit about those skills if their students are to have any hope of learning them. This applies to both metacognition and critical thinking. For example, many students are able to solve complex problems, craft meaningful prose, and create beautiful works of art without understanding precisely how they did it. Such students might be excellent thinkers, but unless they are aware of how they did what they did, it is also possible that they just got lucky. Both critical thinking and metacognition help ensure that students can reliably achieve desired learning outcomes. Both require practice, and both require explicit awareness of the relevant processes. More specifically, however, critical thinkers are aware of what they are trying to do (e.g., what it means to think like an artist, biologist, chemist, dancer, engineer, historian, or psychologist), while metacognitive thinkers are aware of whether their particular strategies are effective (e.g., whether someone is thinking effectively as an artist, biologist, chemist, dancer, engineer, historian, or psychologist). Critical thinking and metacognition, therefore, differ in the object of awareness. Critical thinking involves an awareness of a mode of thinking within a domain (e.g., questioning assumptions about gender, determining the appropriateness of a statistical method), while metacognition involves an awareness of the efficacy of particular strategies for completing that task.

‘Thinking about thinking’ is a good way to spark conversation with our colleagues and our students about a number of important skills, including metacognition and critical thinking. In particular, it is worth asking ourselves (and relaying to our students) what it might mean for someone to think like an artist or a zoologist (critical thinking) and how we would know whether that artist or zoologist was thinking effectively (metacognition). As these conversations move forward, we should also think through the implications for our courses and programs of study. How might this ongoing conversation change course design or methods of instruction? What might it tell us about the connections between courses across our campuses? ‘Thinking about thinking’ is a great place to start such conversations, but we must remember that it is only the beginning.



Metacognitive Judgments of Knowing

Roman Taraban, Ph.D., Dmitrii Paniukov, John Schumacher, and Michelle Kiser, Texas Tech University

“The more you know, the more you know you don’t know.” Aristotle

Students often make judgments of learning (JOLs) when studying. Essentially, they make a judgment about future performance (e.g., on a test) based on a self-assessment of their knowledge of the studied items. JOLs are therefore considered metacognitive judgments: judgments about what a person knows, often related to some future purpose. Students’ accuracy in making these metacognitive judgments is academically important. If students make accurate JOLs, they can devote just the right amount of time to mastering academic materials. If students do not devote enough time to study, they will underperform on course assessments; if they spend more time than necessary, they are being inefficient.

As instructors, it would be helpful to know how accurate students are in making these decisions. There are several ways to measure the accuracy of JOLs. Here we will focus on one of these measures, termed calibration. Calibration is the difference between a student’s JOL related to some future assessment and their actual performance on that assessment. In the study we describe here, college students made JOLs (“On a scale of 0 to 100, what percent of the material do you think you can recall?”) after they read a brief expository text. Actual recall was measured in idea units (IUs), the chunks of meaningful information in the text (Roediger & Karpicke, 2006). Calibration is here defined as JOL – recalled IUs, or simply, predicted recall minus actual recall. If the calibration calculation yields a positive number, you are overconfident to some degree; if it yields a negative number, you are underconfident to some degree. If the result is zero, you are perfectly calibrated in your judgment.
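
To make the arithmetic concrete, here is a minimal sketch (in Python) of the calibration measure as just defined, with the JOL and the recall score converted to the same proportion scale. The function name and the example values are ours for illustration; they are not part of the study.

```python
# A minimal sketch of the calibration arithmetic described above.
# The function name and example values are illustrative only.

TOTAL_IDEA_UNITS = 30  # each passage in the study contained 30 idea units

def calibration(jol_percent: float, recalled_ius: int) -> float:
    """Predicted recall minus actual recall, both expressed as proportions.

    Positive = overconfident, negative = underconfident, zero = perfectly calibrated.
    """
    predicted = jol_percent / 100             # JOL was given on a 0-100 scale
    actual = recalled_ius / TOTAL_IDEA_UNITS  # recall measured in idea units
    return predicted - actual

print(calibration(70, 12))   # 0.70 - 0.40 ≈ 0.30  -> overconfident
print(calibration(40, 18))   # 0.40 - 0.60 ≈ -0.20 -> underconfident
```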

The suggestion from Aristotle (see the quote above) is that gains in how much we know lead us to underestimate how much we know; that is, we will be underconfident. Conversely, when we know little, we may overestimate how much we know; that is, we will be overconfident. Studies using JOLs have found that children are overconfident (predicted recall minus actual recall is positive) (Lipko, Dunlosky, & Merriman, 2009; Was, 2015). Children think they know more than they know, even after several learning trials with the material. Studies with adults have found an underconfidence-with-practice (UWP) effect (Koriat et al., 2002): the more individuals learn, the more they underestimate their knowledge. The UWP effect is consistent with Aristotle’s suggestion. The question we ask here is ‘which is it’: If you lack knowledge, do your metacognitive judgments reflect overconfidence or underconfidence, and vice versa? Practically, as instructors, if students are poorly calibrated, what can we do to improve their calibration, that is, to recalibrate this metacognitive judgment?

We addressed this question with two groups of undergraduate students. Forty-three developmental-reading participants were recruited from developmental integrated reading and writing courses offered by the university, including Basic Literacy (n = 3), Developmental Literacy II (n = 29), and Developmental Literacy for Second Language Learners (n = 11). Fifty-two non-developmental participants were recruited from the Psychology Department subject pool. The non-developmental and developmental readers were comparable in mean age (18.3 and 19.8 years, respectively) and number of completed college credits (11.8 and 16.7, respectively), and each sample represented roughly fifteen academic majors. All participants received course credit. The students were asked to read one of two expository passages and then to recall as much as they could immediately afterward. The two texts were each about 250 words long, had an average Flesch-Kincaid grade level of 8.2, and contained 30 idea units each.

To answer our question, we first calculated calibration (predicted recall – actual recall) for each participant. Then we divided the total sample of 95 participants into quartiles based on the number of idea units each participant recalled. The mean proportion of correctly recalled idea units out of 30 possible, with the standard deviation in parentheses, was as follows for each quartile of the total sample:

Q1: .13 (.07); Q2: .33 (.05); Q3: .51 (.06); Q4: .73 (.09). Using quartile as the independent variable and calibration as the dependent variable, we found that participants were overconfident (predicted recall > actual recall) in all four quartiles. However, there was also a significant decline in overconfidence from Quartile 1 to Quartile 4: Q1: .51; Q2: .39; Q3: .29; Q4: .08. The participants in the highest quartile were nearly perfectly calibrated; they over-predicted their actual performance by only about 8%, compared to the lowest quartile, who over-predicted by about 51%. This monotonic trend of decreasing overconfidence and improving calibration also held when we analyzed the two samples separately:

NON-DEVELOPMENTAL: Q1: .46; Q2: .39; Q3: .16; Q4: .10;

DEVELOPMENTAL: Q1: .57; Q2: .43; Q3: .39; Q4: .13.
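
For instructors who want to try this kind of analysis on their own class data, the quartile breakdown we used can be sketched in a few lines of Python with pandas. The data frame values below are invented for illustration; they are not the study’s data.

```python
# A sketch of the quartile analysis: split participants into quartiles by
# recall, then average calibration within each quartile. Invented data.
import pandas as pd

TOTAL_IDEA_UNITS = 30

df = pd.DataFrame({
    "jol_percent":  [70, 40, 90, 55, 80, 35, 60, 75],  # predicted recall (0-100)
    "recalled_ius": [ 4, 10, 15,  8, 22,  3, 18, 12],  # actual recall (idea units)
})
df["calibration"] = df["jol_percent"] / 100 - df["recalled_ius"] / TOTAL_IDEA_UNITS

# Quartile membership based on recall (Q1 = lowest recall), as in the study.
df["quartile"] = pd.qcut(df["recalled_ius"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Mean calibration per quartile: positive values indicate overconfidence.
print(df.groupby("quartile", observed=True)["calibration"].mean())
```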

The findings here suggest that Aristotle may have been wrong when he stated that “The more you know, the more you know you don’t know.” Our findings suggest the opposite: the more you know, the more you know you know. That is, calibration gets better the more you know. What is striking here is the vulnerability of weaker learners to overconfidence. It is the learners who have encoded little information from their reading who have an inflated notion of how much they can recall. This is not unlike the children in the Lipko et al. (2009) research mentioned earlier. It is also clear from our analyses that typical college students, as well as developmental college students, are susceptible to overestimating how much they know.

It is not clear from this study what variables underlie low recall performance. Low background knowledge, limited vocabulary, and difficulty with syntax could all contribute to poor encoding of the information in the text and low subsequent recall. Nonetheless, our data do indicate that care should be taken in assisting students who fall into the lower performance quartiles to make better-calibrated metacognitive judgments. One way to do this might be to ask students to explicitly make judgments about future performance and then encourage them to reflect on the accuracy of those judgments after they complete the target task (e.g., a class test). Koriat, Lichtenstein, and Fischhoff (1980) asked participants to give reasons for and against choosing responses to questions before the participants predicted the probability that they had chosen the correct answer. Prompting students to consider the amount and strength of the evidence for their responses reduced overconfidence. Metacognitive exercises like these may lead to better calibration.

References

Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6(2), 107-118.

Koriat, A., Sheffer, L., & Ma’ayan, H. (2002). Comparing objective and subjective learning curves: Judgments of learning exhibit increased underconfidence with practice. Journal of Experimental Psychology: General, 131, 147–162.

Lipko, A. R., Dunlosky, J., & Merriman, W. E. (2009). Persistent overconfidence despite practice: The role of task experience in preschoolers’ recall predictions. Journal of Experimental Child Psychology, 102(2), 152-166.

Roediger, H., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255.

Was, C. (2015). Some developmental trends in metacognition. Retrieved from https://www.improvewithmetacognition.com/some-developmental-trends-in-metacognition/

 


Pausing Mid-Stride: Mining Metacognitive Interruptions in the Classroom

By Amy Ratto Parks, Ph.D., University of Montana

Metacognitive interventions are often the subject of research in educational psychology because researchers are curious about how these planned, curricular changes might impact the development of metacognitive skills over time. However, as a researcher in the fields of metacognition and rhetoric and composition, I am sometimes struck by the fact that the planned nature of empirical research makes it difficult for us to take advantage of important kairic moments in learning.

The rhetorical term kairic, taken from the Greek concept of kairos, generally denotes a fortuitous window in time in which to take action toward a purpose. In terms of learning, kairic moments are those perfect little slivers in which we might suddenly gain insight into our own or our students’ learning. In the classroom, I like to think of these kairic moments as metacognitive interruptions rather than interventions because they aren’t planned ahead of time. Instead, the “interruptions” arise out of the authentic context of learning. Metacognitive interruptions are kairic moments in which we, as teachers, can briefly access a point at which a student’s metacognitive strategies have either served them well or not.

A few days ago I experienced a very typical teaching moment that turned out to be an excellent example of a fruitful metacognitive interruption: I asked the students to take out their homework, and the moment I began asking discussion questions rooted in the assignment, I sensed that something was off. I saw them looking at each other’s papers and whispering across the tables, so I asked what was going on. One brave student said, “I think a bunch of us did the homework wrong.”

They were supposed to have completed a short analysis of a peer-reviewed article titled, “The Daily Show Effect: Candidate Evaluations, Efficacy, and American Youth” (Baumgartner & Morris, 2014). I got out the assignment sheet and asked the brave student, Rasa*, to read it aloud. She said, “For Tuesday, September 15. Read The Daily Show Effect: Candidate Evaluations…. oh wait. I see what happened. I read the other Jon Stewart piece in the book.” Another student jumped in and said, “I just analyzed the whole show” and a third said, “I analyzed Jon Stewart.”

In that moment, I experienced two conflicting internal reactions. The teacher in me was annoyed. How could this simple set of directions have caused confusion? And how far was this confusion going to set us back? If only half of the class had done the work, the rest of my class plan was unlikely to go well. However, the researcher in me was fascinated. How, indeed, had this simple set of instructions caused confusion? All of these students had completed a homework assignment, so they weren’t just trying to “get out of work.” Plus, they also seemed earnestly unsure about what had gone wrong.

The researcher in me won out. I decided to let the class plan go, and I began to dig into the situation. By a show of hands, I saw that 12 of the 22 students had done the correct assignment and 10 had completed some customized, new version of the homework. I asked them all to pause for a moment and engage in a metacognitive activity: they were to think back to the moment they read the assignment and ask themselves, where did I get mixed up?

Rasa said that she just remembered me saying something about The Daily Show in class, and when she looked in the table of contents, she saw a different article, “Political Satire and Postmodern Irony in the Age of Stephen Colbert and Jon Stewart” (Colletta, 2014), and read it instead. Other students said that they must not have read closely enough, but then another student said something interesting: “I did read the correct essay, but it sounded like it was going to be too hard to analyze, and I figured that you hadn’t meant for this to be so hard, so I just analyzed the show.” Other students nodded in agreement. I asked the group to raise their hands if they had read the correct essay. Many hands went up. Then I asked if they thought that the analysis they chose to do was easier than the one I assigned. All of them raised their hands.

Again, I was fascinated. In this very short conversation I had just watched rich, theoretical research play out before me. First, here was an example of the direct effect of power browsing (Kandra, Harden, & Babbra, 2012) mistakenly employed in the academic classroom. Power browsing is a relatively recently coined term that describes “skimming and scanning through text, looking for key words, and jumping from source to source” (Kandra et al., 2012). Power browsing can be a powerful overviewing strategy (Afflerbach & Cho, 2010) in an online reading environment where a wide variety of stimuli compete for the reader’s attention. Research shows that strong readers of non-electronic texts also employ pre-reading or skimming strategies (Dunlosky & Metcalfe, 2009); however, when readers mistakenly power browse in academic settings, it may result in “missed opportunities or incomplete knowledge” (Kandra et al., 2012, par. 18). About metacognition and reading strategies, Afflerbach and Cho (2010) write, “the good strategy user is always aware of the context of reading” (p. 206); clearly, some of my students had forgotten their reading context. Some of the students knew immediately that they hadn’t thoroughly read the assignment. As soon as I described the term “power browsing,” their faces lit up. “Yes!” said Rasa, “that’s exactly what I did!” Here was metacognition in action.

Second, as students described the reasoning behind choosing to read the assigned essay but analyze something unassigned, I heard them offering a practical example of Flower and Hayes’ (1981/2011) discussion of goal-setting in the writing process. Flower and Hayes (1981/2011) said that writing includes “not only the rhetorical situation and audience which prompts one to write, it also includes the writer’s own goals in writing” (p. 259). They went on to say that although some writers are able to “juggle all of these demands,” others “frequently reduce this large set of restraints to a radically simplified problem” (p. 259). Flower and Hayes allow that this can sometimes cause problems, but they emphasize that “people only solve the problems they set for themselves” (p. 259).

Although I had previously seen many instances of students “simplifying” larger writing assignments in my classroom, I had never before had a chance to talk with students about what had happened in the moment when they realized something hadn’t worked. But here, they had just openly explained to me that the assignment had seemed too difficult, so they had recalibrated, or “simplified,” it into something they thought they could do well and/or accomplish within the given time frame.

This metacognitive interruption provided an opportunity to “catch” students in the moment when their learning strategies had gone awry, but my alertness to the kairic moment only came as a result of my own metacognitive skills: when it became clear that the students had not completed the work correctly, I paused before reacting, and that pause allowed me to notice a possible metacognitive learning opportunity. When I began to reflect on this class period, I realized that my alertness came from my belief that teachers should be metacognitive professionals, so that we can interject learning into the moment of processing.

There is yet one more reason to mine these metacognitive interruptions: they provide authentic opportunities to teach students about metacognition and learning. The scene I described here could have had a very different outcome. It can be easy to see student behavior in a negative light. When students misunderstand something we thought we’d made clear, we sometimes make judgments about them being “lazy” or “careless” or “belligerent.” In this scenario it seems like it would have been justifiable to have gotten frustrated and lectured the students about slowing down, paying attention to details, and doing their homework correctly.

Instead, I was able to model the kind of cognitive work I actually want to teach them: we slowed down and studied the mistake in a way that led the class to a conversation about how our minds work when we learn. Rather than delivering a seemingly unrelated lecture on “metacognition in learning,” I had a chance to teach them in response to a real moment of misplaced metacognitive strategy. Our 15-minute metacognitive interruption did not turn out to be a “delay” in the class plan, but an opening into a kind of learning that might sometimes just have to happen when the moment presents itself.

References

Baumgartner, J., & Morris, J. (2014). The Daily Show effect: Candidate evaluations, efficacy, and American youth. In C. Cucinella (Ed.), Funny. Southlake, TX: Fountainhead Press. (Reprinted from American Politics Research, 34(3), (2006), pp. 341-367.)

Colletta, L. (2014). Political satire and postmodern irony in the age of Stephen Colbert and Jon Stewart. In C. Cucinella (Ed.), Funny. Southlake, TX: Fountainhead Press. (Reprinted from The Journal of Popular Culture, 42(5), (2009), pp. 856-874.)

Dunlosky, J., & Metcalfe, J. (2009). Metacognition. Thousand Oaks, CA: Sage.

Flower, L., & Hayes, J. (2011). A cognitive process theory of writing. In V. Villanueva & K. Arola (Eds.), Cross-talk in comp theory: A reader (3rd ed., pp. 253-277). Urbana, IL: NCTE. (Reprinted from College Composition and Communication, 32(4), (1981), pp. 365-387.)

Kandra, K. L., Harden, M., & Babbra, A. (2012). Power browsing: Empirical evidence at the college level. National Social Science Journal, 2, article 4. Retrieved from http://www.nssa.us/tech_journal/volume_2-2/vol2-2_article4.htm

Waters, H. S., & Schneider, W. (Eds.). (2010). Metacognition, strategy use, and instruction. New York, NY: The Guilford Press.

* Names have been changed to protect the students’ privacy.