Teaching Perspectives Inventory (TPI): The 5 Perspectives

There are a lot of free surveys/inventories “out there” for all sorts of things, most often related to some aspect of personality. If you use them in a reflective manner, they can help you better understand yourself. The TPI (also free) offers a chance for you to reflect on your teaching perspectives (one aspect of metacognitive instruction). The TPI suggests five perspectives: Transmission, Apprenticeship, Developmental, Nurturing, and Social Reform.

http://www.teachingperspectives.com/tpi/


The Teaching Learning Group at CSUN

Two years ago, eight faculty at California State University, Northridge, began studying how people learn as a grassroots effort to increase student success by focusing on what instructors do in the classroom. Our website shares our efforts, Five Gears for Activating Learning, as well as supporting resources and projects developed to date (e.g., documents, videos, and a yearlong Faculty Learning Community in progress). Although all five gears interact when people learn and develop expertise, our fifth gear, the Developing Mastery gear, focuses on assisting students in developing their metacognitive skills.

http://www.csun.edu/cielo/teaching-learning-group.html


The Six Hour D… And How to Avoid It

This great essay by Russ Dewey (1997) evolved from a handout he used to give his students. He shares some common examples of poor study strategies and explains why they are unlikely to lead to deep learning (even if they are used for 6 hours…). He then shares a simple metacognitive self-testing strategy that could be tailored for courses across the disciplines.

http://www.psywww.com/discuss/chap00/6hourd.htm


Despite Good Intentions, More is Not Always Better

by Lauren Scharff, U.S. Air Force Academy*

A recent post to the PSYCHTEACH listserv got me thinking about my own evolution as a teacher trying my best to help the almost inevitable small cluster of students who struggled in my courses, often despite claiming to “have studied for hours.” The post asked “Have any of you developed a handout on study tips/skills that you give to your students after the first exam?” A wide variety of responses were submitted, all of which reflected genuinely good intentions by the teachers.

However, based on my ongoing exploration of metacognition and human learning, I believe that, despite the good intentions, some of the recommendations will not consistently lead to the desired results. Importantly, these recommendations actually seem quite intuitive and reasonable on the surface, which leads to their appeal and continued use. Most of those that fall into this less ideal category do so because they imply that “More is Better.”

For example, one respondent shared, “I did correlations of their test scores with their attendance so far, the number of online quizzes they have taken so far, and the combined number of these two things. [All correlations were positive ranging from 0.35 to 0.57.] So I get to show them how their behaviors really are related to their scores…”

This approach suggests several things that all seem intuitively positive: online quizzes are a good way to study, and attending class will help students learn. I love that this empowers students by pointing out how their choices of behavior can impact their learning! However, the message that more quizzes and simple attendance will lead to better grades does not capture the true complexity of learning.

Another respondent shared a pre-post quiz reflection assignment in which some of the questions asked about how much of the required reading was completed and how many hours were put into studying. Other questions asked about the use of chapter outcomes when reading and studying, the student’s expected grade on the quiz, and an open-ended question requesting a summary of study approaches.

This pre-post quiz approach seems positive for many reasons. Students are forced to think about and acknowledge levels and types of effort that they put into studying for the quizzes. There is a clear suggestion that using the learning outcomes to direct their studying would be a positive strategy. They are asked to predict their grades, which might help them link their studying efforts with predicted grades. These types of activities are actually good first steps at helping students become more metacognitive (aware and thoughtful) about their studying. Yea!

However, a theme running through the questions seems to be, again, “more is better.” More hours. More reading. The hidden danger is that students may not know how to effectively use the learning outcomes, how to read, how to effectively engage during class, how to best take advantage of practice quizzes to promote self-monitoring of learning, or what to do during those many hours of studying.

Thus, the recommended study strategies may work well for some students, but not all, due to differences in how students implement the strategies. Therefore, even a moderately high correlation between taking practice quizzes and exam performance might mask the fact that there are subgroups for which the results are less positive.

For example, Kontur and Terry (2013) found the following in a core Physics course, “On average, completing many homework problems correlated to better exam scores only for students with high physics aptitude. Low aptitude physics students had a negative correlation between exam performance and completing homework; the more homework problems they did, the worse their performance was on exams.”
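To make this masking concern concrete, here is a minimal sketch (the data, column names, and numbers are entirely hypothetical, not drawn from any of the studies mentioned) of how an overall positive correlation between quizzes taken and exam scores can coexist with a negative relationship within a subgroup:

```python
# Hypothetical illustration of how a pooled positive correlation can mask
# a subgroup with a negative relationship. All data and labels are invented.
import pandas as pd

df = pd.DataFrame({
    "aptitude":      ["high"] * 5 + ["low"] * 5,
    "quizzes_taken": [6, 7, 8, 9, 10, 2, 3, 4, 5, 6],
    "exam_score":    [80, 84, 86, 90, 93, 70, 66, 63, 60, 58],
})

# Pooled across both groups: a clearly positive correlation
print("overall:", df["quizzes_taken"].corr(df["exam_score"]))

# Within each aptitude group the story differs
for level, group in df.groupby("aptitude"):
    print(level, group["quizzes_taken"].corr(group["exam_score"]))
```

In this toy example the pooled correlation is positive, yet the relationship within the low-aptitude group is negative, which is exactly the kind of pattern the Kontur and Terry quote describes.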

I’m sure you’re all familiar with students who seem to go through “all the right motions” but who still struggle, become frustrated, and sometimes give up or develop self-doubt about their abilities. If what students are already doing is not effective, telling them simply to do more of it can actually be harmful.

This is where many teachers feel uncomfortable, because they are clearly working outside their disciplines. Teaching students how to read, how to take effective notes in class, or how to self-monitor their own learning and adjust study strategies to different types of learning expectations is not their area of expertise. Most teachers somehow figured out how to do these things well on their own, or they wouldn’t be teachers now. However, they may never have thought about the underlying processes of what they do when they read or study that allowed them to be successful. They also feel pressure to cover the disciplinary content and focus on the actual course material rather than learning skills. Unfortunately, covering material does little good if the students forget most of the content anyway. Teaching them skills (e.g., metacognitive study habits) offers the prospect that students will retain more of the disciplinary content that is covered.

The good news is that there are more and more resources available for both teachers and students (check out the resources on this website). A couple of great resources specifically mentioned by the listserv respondents are the How to Get the Most out of Studying videos by Stephen Chew at Samford University and the short reading (great to share with both faculty and students) called The Six Hour D… and How to Avoid It by Dewey (1997). Both of these highlighted resources focus on metacognitive learning strategies.

This reflection on the different recommendations is not meant to belittle the well-intentioned teachers. However, by openly discussing these common suggestions and linking them to what we know of metacognition, I believe we can increase their positive impact. Share your thoughts, favorite study suggestions, and metacognitive activities by using the comments link below or by submitting them under the Teaching Strategies tab on this website.

References

Dewey, R. (1997, February 12). The “6 hour D” and how to avoid it. Retrieved from http://www.psywww.com/discuss/chap00/6hourd.htm

Kontur, F., & Terry, N. (2013). The benefits of completing homework for students with different aptitudes in an introductory physics course. arXiv preprint arXiv:1305.2213.

 

* Disclaimer: The views expressed in this document are those of the authors and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Negotiating Chaos: Metacognition in the First-Year Writing Classroom

by Amy Ratto Parks, Composition Coordinator/Interim Director of Composition, University of Montana

“Life moves pretty fast. If you don’t stop and look around once in a while, you could miss it.” John Hughes, Ferris Bueller’s Day Off

Although the movie Ferris Bueller’s Day Off (Hughes, 1986) debuted long before our current first-year college students were born, the combined sentiment of the film remains relevant to them. If we combined Ferris’ sense of exuberant freedom with Cameron’s grave awareness of personal responsibility, and added Sloane’s blasé ennui we might see an accurate portrait of a typical first-year student’s internal landscape. Many of our students are thrilled to have broken out of the confines of high school but are worried about not being able to succeed in college, so they arrive in our classrooms slumped over their phones or behind computer screens, trying to seem coolly disengaged.

The life of the traditional first-year student is rife with negotiations against chaos. Even if we remove the non-academic adjustments of living away from home, their lives are full of confusion. All students, even the most successful, will likely find their learning identities challenged: what if all of their previous academic problem-solving strategies are inadequate for the new set of college-level tasks?

In the first-year writing classroom, we see vivid examples of this adjustment period play out every year. Metacognitive activities like critical reflective writing help students orient themselves because they require students to pause, assess the task at hand, and assess their strategies for meeting the demands of the task. Writing studies researchers know that reflection benefits writers (Yancey, 1998), and portfolio assessment, common in first-year programs across the country, emphasizes reflection as a major component of the course (Reynolds & Rice, 2006). In addition, outcomes written by influential educational bodies such as the National Council of Teachers of English (ncte.org), the Common Core State Standards Initiative (corestandards.org), and the Council of Writing Program Administrators (wpacouncil.org) emphasize metacognitive skills and demonstrate a shared belief in their importance.

But students aren’t necessarily on board. It is the rare student who has engaged in critical reflection in the academic setting. Instead, many aren’t sure how to handle it. Is it busy work from the teacher? Are they supposed to reveal their deep, inner feelings or is it a cursory overview? Is it going to be graded? What if they give a “wrong” reflection? And, according to one group of students I had, “isn’t this, like, for junior high kids?” In this last question we again see the developing learner identity. The students were essentially wondering, “does this reflective work make us little kids or grown ups?”

If we want new college students to engage in the kind of reflective work that will help them develop transferable metacognitive skills, we need to be thoughtful about how we integrate it into the coursework. Intentionality is important because there are a number of ways teachers might accidentally perpetuate these student mindsets. In order to get the most from reflective activities in class, keep the following ideas in mind:

  1. Talk openly with students about metacognition. If we want students to become aware of their learning, then the first thing to do is draw their attention to it. We should explain to students why they might care about metacognitive skills, as well as the benefits of investing themselves in the work. If we explain that reflection is one kind of metacognitive activity that helps us retrieve, sort, and choose problem-solving strategies, then reflection ceases to be “junior high” work and instead becomes a scholarly, collegiate behavior.
  2. Design very specific reflective prompts. When in doubt, err on the side of more structure. Questions like “What did you think about the writing assignment?” seem like they would open the door to many responses; in practice, they allow students to answer without critically examining their writing or research decisions. Instead, design prompts that require students to critically consider their work. For example, “Describe one writing choice you made in this essay. What was the impact of your decision?”
  3. Integrate reflection throughout the semester. Ask students to reflect mid-way through the processes of drafting, research, and writing. If we wait until they finish an essay they learn that reflection is simply a concluding activity. If they reflect mid-process they become aware of their ability to assess and revise their strategies more than once. Also, reflection is a metacognitive habit of mind (Tarricone, 2011; Yancey, 1998) and habits only come to us through repeated activity.

These three strategies are a very basic beginning to integrating metacognitive activities into a curriculum. Not only do they help students evaluate the effectiveness of their attempts at problem solving, but they can also direct the students’ attention toward the strategies they’ve already brought to the class, thereby creating a sense of control over their learning. In the first-year writing classroom, where students are distracted and worried about life circumstances and learner identity, the sense of control gained from metacognitive work is especially important.

 

References

Chinich, M. (Producer), & Hughes, J. (Director). (1986). Ferris Bueller’s day off [Motion picture]. USA: Paramount Pictures.

Reynolds, N., & Rice, R. (2006). Portfolio teaching: A guide to instructors. Boston, MA: Bedford/St. Martin’s.

Tarricone, P. (2011). The taxonomy of metacognition. New York: Psychology Press.

Yancey, K.B. (1998). Reflection in the writing classroom. Logan: Utah State University Press.

National Council of Teachers of English. (2013). First-year writing: What good does it do? Retrieved from http://www.ncte.org/library/nctefiles/resources/journals/cc/0232-nov2013/cc0232policy.pdf

Council of Writing Program Administrators. (2014). Frameworks for success in postsecondary writing. Retrieved from http://wpacouncil.org/framework

Common Core State Standards Initiative. (2014). English language arts standards. Retrieved from http://www.corestandards.org/ELA-Literacy/introduction/key-design-consideration/


Comprehension Monitoring: The Role of Conditional Knowledge

By Antonio Gutierrez, Georgia Southern University

In my previous post, Metacognitive Strategies: Are They Trainable?, I explored the extent to which metacognitive strategies are teachable. In my own research on how well students monitor their comprehension during learning episodes, I discovered that students reported already having a repertoire of metacognitive strategies. Yet, I have often found, in my own teaching and interaction with undergraduate and even graduate students, that having metacognitive declarative knowledge of strategies is often not sufficient to promote students’ comprehension monitoring. For instance, students may already know to draw a diagram when they are attempting to learn new concepts. However, they may not know under which circumstances it is best to apply such a strategy. When students do not know when, where, and why to apply a strategy, they may in fact be needlessly expending cognitive resources for little to no benefit with respect to learning.

Schraw and Dennison (1994) argued that metacognition is divided into knowledge and regulation components. Knowledge comprises declarative knowledge about strategies, procedural knowledge of how to apply them, and conditional knowledge about when, where, and why to apply strategies given task demands. The more that I engage students, inside and beyond my classes, the more convinced I become that the greatest lack in metacognitive knowledge lies not in declarative or procedural knowledge, but in conditional knowledge. Students clearly have a repository of strategies and procedures for applying them. However, they seem incapable of applying those strategies effectively given the demands of the learning tasks in which they engage.

So how can we enhance students’ conditional knowledge? Let’s assume that Sally is attempting to learn the concept of natural selection in her biology lesson. As Sally attempts to connect what she is learning with prior knowledge in long-term memory, she realizes she may have misconceptions regarding natural selection. She also understands that she has a variety of strategies to assist her in navigating this difficult concept. However, she does not know or understand which strategy will optimize her learning of the concept. Thus, she resorts to trial-and-error use of the strategies she thinks are “best” to help her. Here we see a clear example of inadequate conditional knowledge. Much time and cognitive effort could be saved if we enhanced students’ conditional knowledge. Calibration (the relationship between task performance and a judgment about that performance; Boekaerts & Rozendaal, 2010; Keren, 1991) is a related but distinct metacognitive process that involves the comprehension monitoring element of metacognitive regulation. As I continue my scholarship to deepen my understanding of calibration, I wonder whether conditional knowledge and calibration are more closely associated than researchers assume.

In my recent research on calibration, I have often asked why the body of literature is inconclusive in its findings with respect to the effects of metacognitive strategy training on calibration. For instance, some studies have found positive effects on calibration (e.g., Gutierrez & Schraw, in press; Nietfeld & Schraw, 2002) while others have demonstrated no effect of strategy training on calibration (e.g., Bol et al., 2005; Hacker et al., 2008). This inconclusive evidence has frustrated me not only as a scholar but as a teacher as well. I suspect that these mixed findings may arise in part because calibration researchers have neglected to address participants’ metacognitive conditional knowledge. How can we as instructors hope to improve students’ comprehension monitoring when the findings on the role that metacognitive strategy instruction plays in calibration are inconclusive? Perhaps, as researchers and scholars of metacognition, we are asking the wrong questions. I argue that by improving students’ metacognitive conditional knowledge, we can improve their ability to determine more effectively what they know and what they do not know about their learning (i.e., better calibrate their performance judgments to their actual performance). If students cannot effectively apply strategies given the demands of the learning episode (a conditional knowledge issue), how can we expect them to adequately monitor their comprehension (a regulation of learning issue)? Perhaps the next line of inquiry should focus squarely on enhancing students’ conditional knowledge.

 

References

Boekaerts, M., & Rozendaal, J. S. (2010). Using multiple calibration measures in order to capture the complex picture of what affects students’ accuracy of feeling of confidence. Learning and Instruction, 20(4), 372-382. doi:10.1016/j.learninstruc.2009.03.002

Bol, L., Hacker, D. J., O’Shea, P., & Allen, D. (2005). The influence of overt practice, achievement level, and explanatory style on calibration accuracy, and performance. The Journal of Experimental Education, 73, 269-290.

Gutierrez, A. P., & Schraw, G. (in press). Effects of strategy training and incentives on students’ performance, confidence, and calibration. The Journal of Experimental Education: Learning, Instruction, and Cognition.

Hacker, D. J., Bol, L., & Bahbahani, K. (2008). Explaining calibration accuracy in classroom contexts: The effects of incentives, reflection, and explanatory style. Metacognition and Learning, 3, 101-121.

Keren, G. (1991). Calibration and probability judgments: Conceptual and methodological issues. Acta Psychologica, 77(2), 217- 273. http://dx.doi.org/10.1016/0001-6918(91)90036-Y

Nietfeld, J. L., & Schraw, G. (2002). The effect of knowledge and strategy explanation on monitoring accuracy. Journal of Educational Research, 95, 131-142.


Everyday Metacognition

by Craig Nelson, Indiana University

This is my first post for this group. I have two goals. I want to illustrate some ways that we can use metacognition in everyday, non-academic situations. And I want to begin my posts with some reflections that are naïve. Naïve in the sense that I have not digested the work already on the blog and have not turned to the metacognitive literature. This will help me recognize later when other material really challenges my own ideas. We know from work in other contexts that making initial ideas explicit rather than tacit greatly facilitates cognitive change. In the present context, we could say that it activates metacognitive processing tools.

As always, a concrete example will make this clearer. Crouch and her associates (2004) asked “Classroom Demonstrations: Learning Tools or Entertainment?” They found that just doing a demonstration in physics had little effect on students’ understanding. It was simply entertainment. If you had a relevant misconception before the demonstration, you would probably still have it afterwards. Telling someone what is wrong with their misconception, or even having them listen to a carefully constructed lecture or read a carefully constructed text, is “futile”; their ideas are unlikely to change (Arons, 1976). Crouch et al. tried an alternative method. Before presenting the demonstration they asked students to write down what they thought would happen and then discuss their predictions briefly with their neighbors (thus activating prior conceptions and some relevant cognitive and metacognitive frameworks). Crouch and her associates then presented the demonstration and asked the students to compare what happened to their own predictions and to discuss the comparison with their neighbors. This led to significant conceptual change.

One of the most powerful metacognitive tools is exactly this. Ask yourself what a speaker or article or book or demonstration in class or real life is likely to say or show. Make explicit predictions. Whenever possible write them down. Then monitor the extent to which your predictions work out. Congratulate yourself when you are right, and ask why you were wrong when your predictions don’t work out.

A second powerful general metacognitive tool is related. Ask yourself how you do something. Then ask yourself what are some alternative ways you might do it and how you might decide which one to use in the future. For example: What pattern do you follow when you shop in the grocery store? Do you start with produce or end with produce? What else? How else might you systematically shop? And now the tough part: What criteria might favor each of the patterns? For example: Ending with heavy things such as dog food might minimize your pushing effort, but it might also risk crushing more delicate items. I try to minimize the temptation to buy junk food and processed carbohydrates generally. This means that in the grocery stores I visit, I generally shop the margins (produce, meat, dairy) and avoid going through the aisles with canned food, sweets, chips, and related foods unless I have something on my list that is found there. The bottom line is that for the things we do and the ways that we think, we should remember to ask, first, what are the alternatives and, second, what do we gain and lose by the ones that we choose.

I will close with two foreshadowings of points I expect to develop much more extensively later. Learning to think is a strange enterprise. Our best thinking at each point has limits that we cannot see and may not even be able to comprehend even if someone points them out to us. Misconceptions are a basic example. It is very hard to avoid taking any contradictory evidence we encounter and distorting it so that it seems to support our initial misconception (Grant, 2009). We have to make predictions or engage in strangely structured discussions (What would it take to convince you to switch to a new view if you held this misconception? Grant, 2009) or otherwise be effectively challenged. Seek out such challenges to even your most seemingly solid ideas.

This inability to see new ways of thinking applies even to the general way we perceive reality. Suppose that you think that knowledge in general, and science and math in particular, is based on objective truth and is likely to be eternally true. You then might have deep trouble with the titles and core ideas in, for example, Kline’s Mathematics: The Loss of Certainty, Ioannidis’ Why Most Published Research Findings Are False, or Freedman’s Lies, Damned Lies, and Medical Science. Even deeper challenges would be presented by, among many others, Anderson’s Reality Isn’t What It Used To Be or Baxter Magolda’s Authoring Your Life. But each of these implicitly or explicitly presents a metacognitive framework that can be very powerful once we master it. So my final hint today for metacognitive awareness is to play Elbow’s believing game: see if you can understand how an author comes to his or her conclusions even when they seem very different from your own. Or, as Russell put it, the rationale for studying the history of philosophy is to understand how an intelligent person ever came to believe such things; doing so serves as a tool for recognizing the limits of one’s own beliefs. We need to do this broadly, not just historically.

 References

Anderson, Walter Truett. 1990. Reality Isn’t What It Used to Be: Theatrical Politics, Ready-To-Wear Religion, Global Myths, Primitive Chic, and Other Wonders of the Postmodern World. HarperCollins.

Arons, Arnold B. 1976. Cultivating the capacity for formal operations: Objectives and procedures in an introductory physical science course. American Journal of Physics 44: 834-838.

Baxter Magolda, Marcia B. 2009. Authoring Your Life: Developing an Internal Voice to Navigate Life’s Challenges. Stylus.

Crouch, Catherine H., A. P. Fagen, P. Callan and E. Mazur. 2004. “Classroom Demonstrations: Learning Tools or Entertainment?” American Journal of Physics 72:835-838.

Elbow, Peter. 1973. Writing Without Teachers. Oxford University Press.

Freedman, David. 2010. Lies, Damned Lies, and Medical Science. Atlantic. http://www.theatlantic.com/magazine/archive/2010/11/lies-damned-lies-and-medical-science/308269/?single_page=true (or http://bit.ly/11aAmt0).

Grant, B. W. 2009. Practitioner Research Improved My Students’ Understanding Of Evolution By Natural Selection In An Introductory Biology Course. Teaching Issues and Experiments in Ecology, 6(4). http://tiee.ecoed.net/vol/v6/research/grant/abstract.html

Ioannidis, John. 2005. Why Most Published Research Findings Are False. PLoS Medicine August; 2(8): e124. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1182327/ (The most downloaded article in the history of PLoS Medicine.)

Kline, Morris. 1982. Mathematics: The Loss of Certainty. Oxford University Press.

Russell, Bertrand. 1945. A History of Western Philosophy. Simon & Schuster.


What Metacognitive Skills Do Developmental College Readers Need?

by Roman Taraban, Dmitrii Paniukov, and Michelle Kiser

Texas Tech University

In a recent post to the CASP (College Academic Support Programs) listserv, a skeptical developmental programs instructor asked why more attention can’t be given to remedial readers when designing instruction for developmental education. The instructor’s concern highlights the question: What do we know about students who are not “college-ready” and who enroll in developmental coursework? In particular, where does metacognition fit into their development as skilled readers?

We know that reading ability, as measured by standardized instruments like the SAT reading test for high-school students, is significantly associated with reading comprehension (Taraban, Rynearson, & Kerr, 2000). But what underlies this reading ability, and can it be enhanced in college students? Prior research has revealed several things. As university students progress from their freshman to senior years, they show small but significant growth in their use of metacognitive reading strategies (Taraban, 2011). This growth happens naturally; that is, college students typically do not take courses that teach metacognition. In trying to deliberately develop metacognitive reading strategies in developmental reading students, however, we found that the process can be slow and costly, but it can be done! In a study of developmental college readers, it took roughly one semester of regular practice with a look-back reading strategy (Garner, 1987) to show significant improvement in reading comprehension (Taraban et al., 1997). In addition to semester-long practice, the intervention was implemented in one-on-one tutoring, pointing to the instructional costs of bringing about detectable gains in reading skills in a remedial population.

Recently my colleagues and I had an opportunity to work with developmental readers who were enrolled in a developmental reading course at a major public research university. The students were primarily freshmen (mean number of completed credits = 16.7). We were interested in three questions: 1) Could a teacher-implemented intervention improve these students’ comprehension and retention of ideas from expository texts? 2) Which metacognitive reading strategies did these students apply on their own? and 3) Was students’ use of metacognitive strategies associated with better retention of information?

The students were asked to read two expository passages and to recall as much as they could either immediately or after a 48-hour delay. They were told that they would be asked later to recall the information from the texts, but they were not prompted to apply any specific learning strategies. The two texts used for the study were each about 250 words in length and had an average Flesch-Kincaid readability score of grade level 8.2. The passages contained 30 idea units each. Idea units are simple units of meaning derived from the text, and here they were used to score the recall data. Participants read and studied one of the passages without interruption (Uninterrupted Condition), and they read and studied the other passage paragraph by paragraph and then all together (Segmented Condition). Participants spent an equal amount of total time (10 minutes) reading and studying each of the texts. After they recalled the information, we asked them to report the strategies they used to learn the information. The specific self-reported strategies were organized into six types, as shown in Table 1. To score the strategy-use data, participants were given credit for multiple strategy types, but not for repetitions of the same strategy for the same text.

 TABLE 1: Key Types of Self-Reported Strategies

  1. REPETITION: Re-Reading; Memorize; Repetition
  2. FOCUSING ON SPECIFIC ELEMENTS: Key words; Key concepts; Grouping terms or sentences; Identifying related concepts; Parts that stood out; Parts that were difficult
  3. SELF TESTING: Summarizing; Recalling; Quizzing self; Forming acronyms
  4. GENERATING COGNITIVE ELABORATIONS: Activating prior knowledge; Recalling related experiences; Re-explaining parts of the text in other ways; Comparing and contrasting ideas; Using analogies; Using mental imagery
  5. SEGMENTATION: Grouping sentences for purposes of study; Divide by paragraph
  6. GENERAL: Read slowly; Read thoroughly; Concentrate; Understand passage

Regression analyses were conducted in order to evaluate the effectiveness of the reading approach (Uninterrupted vs. Segmented) in conjunction with participants’ self-reported use of the six strategy types (see Table 1). Turning to the immediate test, the reading approach mattered. When participants read and studied a segmented text, they had significantly higher recall of idea units (M = 11.64) compared to non-segmented text (M = 7.93). Further, all of the participants reported using reading strategies. Of the six strategy types, participants’ application of FOCUSING ON SPECIFIC ELEMENTS during reading was strongly associated with better recall of information from the text, and REPETITION was also important. Considering the delayed test next, the reading approach used for the text that was read two days earlier did not matter. However, using the strategy type SELF TESTING during reading was strongly associated with better recall, and FOCUSING ON SPECIFIC ELEMENTS was also helpful.
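To give a concrete picture of this kind of analysis, here is a minimal sketch (the data, column names, and numbers are hypothetical, not the study's materials) of a regression predicting idea units recalled from reading condition and 0/1 indicators for three of the six strategy types, with each type credited at most once per text as described above:

```python
# Hypothetical sketch of the regression described above: idea units recalled,
# predicted from reading condition and self-reported strategy-type use.
# Data and column names are illustrative, not the study's actual data.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "recall":       [13, 8, 11, 10, 12, 9, 15, 9],  # idea units recalled (0-30)
    "segmented":    [1, 0, 1, 0, 1, 0, 1, 0],       # 1 = Segmented, 0 = Uninterrupted
    "repetition":   [1, 1, 0, 0, 0, 1, 1, 0],       # strategy types credited once per text
    "focusing":     [1, 0, 0, 1, 1, 0, 1, 1],
    "self_testing": [0, 0, 1, 1, 0, 1, 1, 0],
})

# Ordinary least squares: which predictors are associated with better recall?
model = smf.ols("recall ~ segmented + repetition + focusing + self_testing", data=data).fit()
print(model.params)
```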

To address our skeptical developmental instructor, our data suggest that developmental reading instructors can structure how students process information in order to increase the number of ideas students retain, for follow-up activities like inferencing and brainstorming. The data also showed that developmental readers naturally use metacognitive reading strategies to boost their retention of information both immediately and at a delay.  Interestingly, there is no single best strategy.  Rather, FOCUSING ON SPECIFIC ELEMENTS during reading is most effective for immediate retention and SELF-TESTING during reading is most effective for longer-term retention.  Developmental students’ natural disposition to apply strategies may open opportunities for instructors to further guide, enhance, and channel these metacognitive skills to better benefit students.  What is heartening in these data is the finding that these academically-challenged students self-initiate metacognitive activities to monitor and regulate their study behaviors in order to enhance their academic performance.

References

Garner, R. (1987). Metacognition and reading comprehension. Norwood, NJ: Ablex.

Taraban, R. (2011). Information fluency growth through engineering curricula: Analysis of students’ text-processing skills and beliefs. Journal of Engineering Education, 100(2), 397-416.

Taraban, R., Becton, S., Shufeldt, M., Stirling, T., Johnson, M., & Childers, K. (1997). Developing underprepared college students’ question-answering skills. Journal of Developmental Education, 21 (1), 20-22, 24, 26, 28.

Taraban, R., Rynearson, K., & Kerr, M. (2000). College students’ academic performance and self-reports of comprehension strategy use. Journal of Reading Psychology, 21, 283-308.


Making Thinking Visible

Ritchhart, R., Church, M., and Morrison, K. (2011). Making thinking visible: How to promote engagement, understanding, and independence for all learners. San Francisco: Jossey-Bass.

In Making Thinking Visible, the authors propose that we must make our students’ thinking visible in order to create places of intellectual stimulation.   To do this, the authors suggest first determining which modes of thinking are necessary for our disciplines or courses.  Then, through a series of research-based thinking routines, we can scaffold and support the development of individuals who can think, plan, create, question, and engage independently as learners.  If you are looking for both inspiration and pragmatic strategies, this book offers ideas that can be applicable to all educational settings and audiences.


Faculty Metacognition of Verbal Questioning

by Charity Peak, U.S. Air Force Academy*

Few faculty would argue that teaching requires asking questions of students, but rarely do instructors consider the what, how, or why of their verbal questioning behavior.  Without metacognition of questioning strategies, this foundational instructional technique can be wasted on habit rather than design.

Faculty question students for a variety of reasons.  Surprisingly, most faculty use verbal questioning as a classroom management technique.  This might look something like a machine gun approach, firing question after question in multiple directions in an effort to keep the class engaged.  See a student dozing? Fire!  Someone checking Facebook? Fire!  Some researchers estimate that teachers ask as many as 120 questions per hour—a question every 30 seconds (Vogler, 2005)!  While this strategy may keep students on their toes, it does not necessarily aid student learning.  Often these questions are low-level cognitive questions, requiring mainly recall of factual knowledge.  If teachers wish to develop deeper levels of thinking, they must stimulate their students’ own evaluation of the content rather than merely requesting regurgitation of the basics.

At the other end of the spectrum is a master teacher’s approach to instruction that utilizes a specific questioning taxonomy proven to be effective for a variety of disciplines.  Rather than using the run-and-gun approach, this faculty member masterfully leads students from one point to another through a series of thoughtfully derived questions.  He or she might start with the big picture and lead to a specific point or, in contrast, begin with minutia but guide students to one main relevant theme by the end of class.  Watching these instructors in action is often humbling.  However, even these most masterful teachers are often not cognitively aware of the strategies they are using.  They have figured out what works over time, but they likely can’t point to a specific methodology they were using to support their instruction.  Rather than shooting in the dark over many years, faculty would be wise to understand the metacognition behind verbal questioning if they wish to be effective in creating higher order thinking in their students.

Moving beyond simple recall in questioning is certainly good advice for creating more opportunities in thinking, but it’s easier said than done.  Faculty often report feeling uncomfortable trying new questioning strategies.  Asking higher order thinking questions for application, analysis, and synthesis often creates extensive dead air time in the classroom.  More difficult questions require more time to think, often in silence.  Also, students are reluctant to change the very well-established classroom culture of “getting the answer right.”  Based on years of classroom experience, students will often fire answers back, playing the game of “Guess what’s in the teacher’s head.”

Despite these cultural norms, it is possible through metacognition to improve verbal questioning.  Some scholars argue that faculty should understand some of the basic questioning taxonomies that exist and how they influence learning.  For example, asking open-ended versus closed-ended questions will alter the cognitive level of thinking and response (Rothstein & Santana, 2011).  Open-ended questions tend to elicit thinking that is higher on Bloom’s Taxonomy.  Students are required to generate thoughtful answers to questions as opposed to firing back one- to three-word facts.  For example, instead of asking, “What is an adverb?” faculty might ask students to apply their learning by identifying an adverb in a sentence or even creating their own sentences using adverbs.  Better yet, The Right Question Institute (Rothstein & Santana, 2011) encourages faculty to get students to ask their own questions rather than teachers doing all the work.  After all, the person generating the questions is arguably the person who is learning the most.

Other scholars suggest that faculty should consider the sequencing and patterns that are possible when asking questions (Vogler, 2005).  For example, cognitive psychologists often suggest a funneling or convergent questioning technique, which leads students from the big picture to details because it mirrors the cognitive functioning of the brain.  However, depending on the subject area, faculty may find success in guiding students from narrow to broad thinking (divergent) by first asking low-level, general questions followed by higher-level, specific questions.  Some disciplines lend themselves to using a circular path to force critical thinking in students.  This pattern asks a series of questions which eventually lead back to the initial position or question (e.g., “What is justice?”).  While students often find these patterns frustrating, they emphasize the value of thinking rather than of simply identifying the right answer.

Ultimately, though, faculty would be wise to spend less energy on the exact strategy they plan to use and instead focus on the main goals of their questioning.  In Making Thinking Visible (Ritchhart, Church, & Morrison, 2011), the authors propose that the purpose of questioning is really to make our students’ thinking visible by understanding our own expert-level thinking—aka metacognition.   To do this, the authors suggest that instead of complex taxonomies and patterns, we should focus our efforts on three main purposes for questioning in our classes:

  1. Modeling our interest in the ideas being explored
  2. Helping students to construct understanding
  3. Facilitating the illumination of students’ own thinking to themselves (i.e., metacognition)

By asking authentic questions – that is, questions to which the teacher does not already know the answer or to which there are not predetermined answers – instructors create a classroom culture that feels intellectually engaging, fosters a community of inquiry, and allows students to see teachers as learners (p. 31).  Faculty must frame learning as a complex communal activity rather than the process of merely accumulating information.  Thoughtful questioning creates this classroom climate of inquiry, but only if faculty are metacognitive about their purpose and approach to using this critical pedagogical strategy.  Without metacognition, faculty risk relying on the machine gun approach to questioning, wasting valuable class time on recall of factual information rather than elevating and revealing students’ thinking.

Ritchhart, R., Church, M., and Morrison, K. (2011). Making thinking visible: How to promote engagement, understanding, and independence for all learners. San Francisco: Jossey-Bass.

Rothstein, D., and Santana, L. (2011). Make just one change: Teach students to ask their own questions. Boston: Harvard Education Press.

Vogler, K. E. (2005). Improve your verbal questioning. The Clearing House, 79(2): 98-103.

 

* Disclaimer: The views expressed in this document are those of the authors and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Using metacognitive writing assignments to improve course performance

Mynlieff, Manogaran, St. Maurice, and Eddinger discuss the use of metacognitive writing exercises in large biology classes. Students were asked to explicitly consider why they made mistakes on exams and discuss why another answer would have been more appropriate. Students completing these assignments showed marked improvement in subsequent course assessments.

Mynlieff, M., Manogaran, A. L., St. Maurice, M., & Eddinger, T. J. (2014). Writing assignments with a metacognitive component enhance learning in a large introductory biology course. CBE-Life Sciences Education, 13(2), 311-321.

 


Using metacognitive awareness to facilitate healthy engagement with moral issues

By John Draeger, SUNY Buffalo State

As the new semester begins, I am again looking out on a classroom full of students eager to discuss “hot button” moral issues (e.g., abortion, euthanasia, hate-speech, same-sex marriage, drug legalization). In an earlier post entitled, “Using metacognition to uncover the substructure of moral issues,” I argued that metacognitive awareness can help students move beyond media pundit drivel and towards a more careful consideration of moral issues. In “Cultivating the habit of constructive discomfort”, I argued that learning requires cultivating a certain healthy discomfort (much like the discomfort often associated with vigorous exercise) and it is metacognitive awareness that keeps us within our own “zone of proximal development” (Vygotsky 1978). This post considers some of the sources of discomfort that threaten to undermine the discussion of moral issues.

Confronting “hot button” moral issues can be difficult because each of us brings our own complicated history to the conversation (replete with hang-ups and blind spots). Based on my many years of teaching moral philosophy, I offer the following list of items that I have found tend to derail discussion. The list is by no means exhaustive, and whether these are the elements most likely to impede engagement is ultimately an empirical question that needs to be answered. However, I argue that all of us (instructors, students, those outside the classroom) need to be aware of our own sources of discomfort with moral matters if we hope to move beyond them and towards a healthy engagement with these important issues.

Sources of discomfort: 

(1) Entrenched beliefs— some moral issues are difficult to consider because they force us to confront our foundational values.  For example, those from a wide variety of religious traditions can find it difficult to be completely open-minded to the possibility that abortion and same-sex marriage could be permissible. While they can summarize a particular position on the issue (e.g., for a particular course assignment), many find it difficult to move beyond a “bookish” articulation of the problem towards a genuine consideration of the issues because it threatens to undermine other firmly held beliefs (e.g., religious teachings).

(2) Peer pressure — many students find it difficult to swim against the current of peer opinion. When discussing sex, for example, students want to avoid being seen as either too prudish or too perverted. Sometimes students have views that fall outside the range of perceived acceptability but they refuse to voice them for fear of social disapproval. Other times, it doesn’t even occur to them to consider anything outside the norm. In both cases, peer pressure can undermine full consideration of the issues.

(3) Self-interest — shifts in moral position require changes in our behavior. For example, “buying into” arguments for animal rights might demand that we change our eating habits. Often, it is easy to discount these arguments, not because they lack merit, but because we do not want to make the lifestyle changes that might be required if we became convinced by the argument.

(4) “Afraid of looking in the mirror” — discussions of moral issues can reveal uncomfortable truths about ourselves. Discussions of racial and gender discrimination, for example, can make us uncomfortable when we realize that we (or those we love) have attitudes and behaviors that are insensitive and even hurtful.

(5) Ripple effects — because moral issues are interrelated, modifying our view on one issue can send ripple effects through our entire conceptual system.  For example, a discussion of euthanasia might lead us to the conclusion that quality of life is important and even that some lives are no longer worth living (e.g., lives of extreme pain without the prospect of relief). If so, then we might come to believe that it would be better if some people were never born. Thus, thinking carefully about euthanasia might change our view of abortion. Likewise, becoming convinced by arguments for individual freedom in one area (e.g., free speech) can lead us to rethink our views in other areas (e.g., drug legalization, abortion, hate speech). However, if a student senses that a ripple might turn into a tidal wave, they often disengage.

In each case, becoming aware of the sources for our discomfort can help us move beyond a superficial consideration of the issues. In particular, asking a series of metacognitive questions can help uncover whether the discomfort is healthy (e.g., struggle with unfamiliar or difficult material) or unhealthy (e.g., blocked by entrenched beliefs, peer pressure, self-interest, or an inability to look in the mirror).

Questions we might ask our students (or even ourselves):

  • To what extent is my thinking on a particular issue being influenced by my firmly held beliefs, the views of my peers, self-interest, a reluctance to take an honest look in the mirror, or concerns about the need to revise my entire ethical system?
  • Am I taking the moral issue under consideration seriously? Why or why not?
  • Would I be willing to change my stance if the argument was compelling? Why or why not?
  • Is there something about the view that I cannot bring myself to consider? If so, what?

While awareness of our various blind spots and areas of discomfort will not automatically improve the quality of discussion, it can pave the way for a more meaningful consideration of the issues. As such, metacognitive awareness can facilitate healthy engagement with moral issues.

Reference:

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.


Testing Improves Knowledge Monitoring

by Chris Was, Kent State University

Randy Isaacson and I have spent a great deal of time and effort creating a curriculum for an educational psychology class to encourage metacognition in preservice teachers. Randy spent a number of years developing this curriculum before I joined him in an attempt to improve it and to use it to test hypotheses regarding the improvement of metacognition with training for undergraduate preservice teachers. A detailed description of the curriculum can be found in the National Teaching and Learning Forum (Isaacson & Was, 2010), but I wanted to take this opportunity to give a simple overview of how we structured our courses and some of the results produced by using this curriculum to train undergraduates to be metacognitive in their studies.

With our combined 40+ years of teaching, we are quite clear that most undergraduates do not come equipped with the self-regulation skills that one would hope students would acquire before entering the university. Even more disappointing, students lack the metacognition required to successfully regulate their own learning behaviors. Creating an environment that not only encourages, but also requires, students to be metacognitive is not a simple task. However, it can be accomplished.

Variable Weight-Variable Difficulty Tests

The most important component of the course structure is creating an environment with extensive and immediate feedback. The feedback should be designed to help the student identify specific deficiencies in his or her learning strategies and metacognition. We developed an extensive array of learning resources that guide students to focus on knowing what they know and when they know it. The first resource we developed is a test format that helps students reflect on and monitor their knowledge regarding the content and items on the test. In our courses, we have students judge their accuracy and confidence in their responses for each item and predict their scores for each exam.

Throughout the semester, students are administered a weekly exam (the courses meet Monday, Wednesday, and Friday, with the exams occurring on Friday). Each examination is based on a variable weight, variable difficulty format and contains a total of 35 questions: 15 Level I questions at the knowledge level, 15 Level II questions at the evaluation level, and 5 Level III questions at the application/synthesis level. Scoring is based on a system that increases points for correct responses in relation to the increasing difficulty of the questions and students’ confidence in their responses. Students choose 10 Level I questions and put those answers on the left side of the answer sheet; these are worth 2 points each. Ten Level II questions, worth 5 points each, and three Level III questions, worth 6 points each, are also placed on the left side. Students are also required to place the questions they are least confident about on the right side of the answer sheet; these are worth only one point each (the remaining 5 Level I questions, 5 Level II questions, and 2 of the 5 Level III questions). This scoring yields a possible 100 points for each exam. Correlations between total score and absolute score (number correct out of 35) typically range from r = .87 to r = .94. Although we provide students with many other resources to encourage metacognition, we feel that the left-right test format is the most powerful influence on students’ knowledge monitoring throughout the semester.
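For concreteness, here is a minimal sketch of the point arithmetic described above (the data structure and function are hypothetical illustrations, not the authors' grading materials):

```python
# Sketch of the variable weight-variable difficulty (VW-VD) scoring scheme.
# Left-side (confident) answers earn level-based points; right-side
# (least-confident) answers earn 1 point each.

LEFT_POINTS = {1: 2, 2: 5, 3: 6}   # points per level for left-side answers
RIGHT_POINTS = 1                   # points for any right-side answer

def score_exam(responses):
    """responses: iterable of (level, placed_left, correct) tuples."""
    total = 0
    for level, placed_left, correct in responses:
        if correct:
            total += LEFT_POINTS[level] if placed_left else RIGHT_POINTS
    return total

# A perfect paper: 10 Level I, 10 Level II, and 3 Level III answers on the left,
# plus the remaining 5 + 5 + 2 answers on the right.
perfect = ([(1, True, True)] * 10 + [(2, True, True)] * 10 + [(3, True, True)] * 3
           + [(1, False, True)] * 5 + [(2, False, True)] * 5 + [(3, False, True)] * 2)
assert score_exam(perfect) == 100   # 20 + 50 + 18 points on the left, 12 on the right
```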

The Results

Along with our collaborators, we have conducted a number of studies using the variable weight-variable difficulty (VW-VD) tests as a treatment. Our research questions focus on whether the test format increases knowledge monitoring accuracy, on individual differences in knowledge monitoring and metacognition, and on psychometric issues in measuring knowledge monitoring. Below is a brief description of some of our results.

Hartwig, Was, Isaacson, & Dunlosky (2011) found that a simple knowledge monitoring assessment predicted both test scores and number of items correct on the VW-VD tests.

Isaacson & Was (2010) found that after a semester of VW-VD tests, knowledge monitoring accuracy on an unrelated measure of knowledge monitoring increased.


Incorporating Metacognitive Leadership Development in Class

by Lauren Scharff, U.S. Air Force Academy*

During the spring 2014 semester I decided to try an explicitly metacognitive approach to leadership development in my Foundations for Leadership Development course in the Behavioral Sciences department at the United States Air Force Academy.

I had taught the course twice before and had many discussions with other course instructors. Overall, my sense was that many of our students didn’t intentionally and systematically connect what they were doing and learning in the course with their own personal leadership development. This is despite a paper that focused on a personal leadership development plan, and a video project that focused on implementing positive change within their squadrons.

This course is an upper-level, required core course in our curriculum, and my section was one of more than 30 sections taught by approximately a dozen different instructors. At our institution, much of the course structure within core courses is standardized across instructors, but I had 20% of the points with which to do what I desired, as long as 10% somehow assessed accountability for lesson preparation.

I was aware of a foundation of research in skill development (e.g. Svinicki, 2004), so I knew that in order to most effectively develop skills, people need multiple opportunities for practice coupled with feedback.  Feedback leads to awareness of strengths, shortcomings, and possible alternate strategies. This understanding of skill development became intertwined with my increasing focus on metacognitive approaches. I came to the conclusion that perhaps part of the less-than-ideal student connection to the course material and objectives occurred because our course activities that were designed to support that connection didn’t provide (require) enough opportunities for practice and continued awareness, especially beyond the classroom and course requirements.

As I prepared for the semester I drew on resources from The Learning Record, which outlines Five Dimensions of Learning that connected well with goals I had for my students’ leadership development: confidence and independence, knowledge and understanding, skills and strategies, use of prior and emerging experience, and critical reflection. The site also shares well-developed activities and assignments that supported my goal of using a metacognitive approach to promote my students’ leadership development.

Ultimately, I designed my course to be centered around journal entries, which I also completed.  During each lesson we shared our understandings, questions, and reflections based on the readings, as well as examples of personal observations of leadership and our reflections on how what we were learning might be effectively applied to real situations. More specifically, the journal entries included 1) answers to guided reflection questions about each reading for each lesson, and 2) at least two personal leadership observations and analyses each week. I created a simple grading system so that I wouldn’t be overloaded with assessing journal entries every lesson. (Journal assignment)

A quick poll of my students (N=13) indicated that none of them regularly kept personal journals, and only two had ever had any sort of journal assignment for a class. Thus, the requirement for regular journal writing demanded a change of habit for them and represented increased time and energy for class preparation. Although there was some adjustment, when I asked for feedback after two weeks of class, the students unanimously agreed that they were reading more deeply than they would have if I had used reading quizzes for accountability, and that they preferred to continue the personal and reading reflections even though these involved frequent writing. Discussion during class was deeper and more engaging than in previous semesters.

Twice during the semester, students wrote evidence-based personal development evaluations, based on a shared example from The Learning Record. Students chose examples from their journals to support their evaluations of their own leadership development. These evaluation exercises forced them to thoughtfully review their observations across the weeks of the semester and develop ongoing awareness of their leadership strengths and weaknesses as well as an understanding of alternate strategies and when/how they might be useful for their leadership efforts. (Personal Development Evaluation assignment)

I also added a question each time that had them evaluate the journal approach and course design.  We made some tweaks at mid-semester. By the end of the semester, all but one student reported that the journal entries deepened their learning and personal awareness of their leadership development. While I will likely make some further tweaks in future semesters, I believe that this approach was a success, and that it could be scaled up for larger classes (see the simplified grading scheme in the Journal assignment). Below are sample comments from the final evaluation assignment (released with student permission):

“The leadership journal has had a tremendous effect on my personal development as a leader.  The journal has made me aware of my strengths and weaknesses…. The journal allowed me an avenue to give time and actually think about how I am doing as a leader and peer within my environment.”

 

“The personal observations were definitely helpful for documenting our successes and failures, which we can look back upon and improve. This relates not only to our personal leadership development, but to how we learn about ourselves.”

 

“These journals have taken all of us on a journey through the semester.  They undoubtedly began as something that we disliked and looked down upon each week.  However, I have really grown to love and understand this application of leadership growth.  They not only provide a chance for us to look back on our leadership gains and failures, they offer an opportunity for us to challenge ourselves in order to write about the things we want to see in ourselves.  The journals have become much more than a simple task of writing on a week-to-week basis.  They have grown into an outpouring of our character and lives as we turn the page from underclassmen to upper-class leaders and eventually to lieutenants in a few short months.  I believe that these journals are also a metaphor for many leadership challenges in that they will be frequent, difficult, and time consuming, but in the end they will let us all grow.  ….my reflections are not simply babble, …they actually represent significant growth and understanding of myself.”

References:

Svinicki, M. (2004). Learning and motivation in the postsecondary classroom. Bolton, MA: Anker Publishing Co.

The Learning Record: http://www.learningrecord.org/

————————————-

* Disclaimer: The views expressed in this document are those of the authors and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.

 


Assessing Metacognitive Awareness

Constructed by Rayne Sperling and Gregory Schraw (1994), the Metacognitive Awareness Inventory (MAI) is a well-established and useful assessment of metacognition. The MAI has been used in hundreds of studies, ranging from basic to applied research. It is a 52-item inventory organized into two broad categories (i.e., knowledge of cognition and regulation of cognition), each with several sub-categories.
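Because the MAI yields two broad sub-scores rather than a single total by default, scoring it amounts to tallying endorsements within each category. The Python sketch below only illustrates that structure: the item-to-category split is a placeholder, not the published scoring key, so consult Schraw and Dennison (1994) for the actual item assignments and response format.

    # Illustrative scoring sketch for a two-category, 52-item true/false
    # inventory such as the MAI. The item ranges below are placeholders,
    # NOT the published scoring key.
    KNOWLEDGE_OF_COGNITION = set(range(1, 18))    # placeholder: items 1-17
    REGULATION_OF_COGNITION = set(range(18, 53))  # placeholder: items 18-52

    def score_mai(responses):
        """responses: dict mapping item number (1-52) to True (endorsed) / False."""
        knowledge = sum(bool(responses.get(i)) for i in KNOWLEDGE_OF_COGNITION)
        regulation = sum(bool(responses.get(i)) for i in REGULATION_OF_COGNITION)
        return {"knowledge_of_cognition": knowledge,
                "regulation_of_cognition": regulation,
                "total": knowledge + regulation}

    # Hypothetical respondent who endorsed the odd-numbered items.
    example = {i: (i % 2 == 1) for i in range(1, 53)}
    print(score_mai(example))  # {'knowledge_of_cognition': 9, 'regulation_of_cognition': 17, 'total': 26}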

Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19(4), 460-475.

 


The Effects of Metacognition and Concrete Encoding Strategies on Depth of Understanding in Educational Psychology

Suzanne Schellenberg, Meiko Negishi, and Paul Eggen (2011) from the University of North Florida describe a useful method for increasing their students' metacognition. They found that when educational psychology students were taught specific encoding strategies, they academically outperformed a control group in learning course material.

Schellenberg, S., Negishi, M., & Eggen, P. (2011). The effects of metacognition and concrete encoding strategies on depth of understanding in educational psychology. Teaching Educational Psychology, 7(2), 17-24.


Changing Epistemological Beliefs in Pre-service Teacher Education Students

Joanne Brownlee, Nola Purdie, and Gillian Boulton-Lewis (2001) describe an interesting method for improving students' epistemological beliefs using reflective journal assignments. Brownlee and colleagues found that students who engaged in these reflective practices significantly improved their epistemological beliefs relative to students who did not complete the activities.

Brownlee, J., Purdie, N., & Boulton-Lewis, G. (2001). Changing epistemological beliefs in pre-service teacher education students. Teaching in Higher Education, 6(2), 247-268.


Metacognition and Reflective Thinking

By Steven C. Fleisher, California State University Channel Islands

Imagine that we are reading an assignment. As we read, do we think, "How long will this take?" or "Will this be on the test?" If so, try this instead: presume that we are reading the article as preparation for meeting later with an important person, such as our supervisor, to discuss it. How would this situation change the questions we ask ourselves? Such thinking can make us aware of what constitutes satisfactory mastery of the material and how to achieve it.

Think back for a moment to learning a psychomotor skill, such as riding a bicycle. Most of us can master that skill with ordinary innate balance and strength, and we might think, "That's all there is to it." However, watching cyclists in a serious bicycle race or triathlon reveals that reliance on innate ability alone cannot produce that kind of performance. That level of expertise requires learning to pedal with cadence, to deliver equal power from both legs, to use the gearing appropriately, to exploit position within a group of racers, and to pace oneself relative to challenges. Untrained innate ability rarely gets us as far as informed training does.

The same is true in learning. Metacognitive skills (learnable skills) enhance academic performance. People with metacognitive skill will usually outperform others who lack such skill, even others with greater innate intelligence (natural ability). Metacognitive training requires developing three strengths: 1) metacognitive knowledge, 2) metacognitive monitoring, and 3) metacognitive control.

Metacognitive knowledge refers to our understanding about how learning operates and how to improve our learning. We should have enough of this knowledge to articulate how we learn best. For example, we can know when it is best for us to write a reflection about a reading in order to enhance our learning. We should be alert to our misconceptions about how our learning works. When we learn that cramming is not always the best way to study (Believe it!), we must give that up and operate with a better proven practice.

Metacognitive monitoring refers to the developed ability to monitor our progress and achievement accurately. For example, self-assessment is a kind of metacognitive monitoring. We should know when we truly understand what we are reading and be able to assess whether we are making progress toward solving a problem. When we become accurate and proficient in self-assessment, we are much better informed: we can see when we have mastered certain material well enough, and when we have not.

Metacognitive control. This competency involves having the discipline and control needed to make the best decisions in our own interests. This aspect of metacognition includes acting to change our efforts or learning strategies, or taking action to recruit help when it is needed.

Putting it together. When we engage in metacognitive reflection, we can ask ourselves, for example, “What did we just learn?” “What was problematic, and why?” “What was easy, and why?” “How can we apply what we just learned?” Further, when we gain metacognitive skill, we begin to internalize habits of learning that better establish and stabilize beneficial neural connections.

Reflective Exercises for Students:

  1. Metacognitive knowledge. Consider three learning challenges: acquiring knowledge, acquiring a skill, or making an evidence-based decision. How might the approaches needed to succeed in each of these three separate challenges differ?
  2. Metacognitive monitoring. After you complete your next assignment or project, rate your resultant state of mastery on the following scale of three points: 0 = I have no confidence that I made any meaningful progress toward mastery; 1 = I clearly perceived some gain of mastery, but I need to get farther; 2 = I am currently highly confident that I understand and can meet this challenge.
  3. Next, see if your self-rating leads you to take action, such as re-studying the material or seeking help from a peer or an instructor in order to achieve more competence and higher confidence. One critical test will be whether your awareness from monitoring was able to trigger that action. Another will come with time: whether your self-assessment proved accurate (one simple way to check this across assignments is sketched after this list).
  4. Metacognitive control. To develop better understanding of this, recall an example from life when you made a poor decision that proved to produce a result that you did not desire or that was not in your interests. How did living this experience equip you to better deal with a similar or related life challenge?
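As a follow-up to exercise 3, the Python sketch below shows one purely illustrative way to check, across several assignments, whether self-ratings on the 0-2 scale above proved accurate. The mapping from eventual scores back onto the 0-2 scale is an arbitrary assumption, included only to make the comparison concrete.

    # Illustrative calibration check: did my 0-2 self-ratings match how the
    # assignments actually turned out? Thresholds below are arbitrary examples.
    def rating_from_score(score_pct):
        """Map an eventual assignment score (0-100) onto the same 0-2 scale."""
        if score_pct >= 85:
            return 2
        if score_pct >= 65:
            return 1
        return 0

    def calibration(self_ratings, scores):
        """self_ratings and scores are parallel lists, one entry per assignment."""
        hits = sum(r == rating_from_score(s) for r, s in zip(self_ratings, scores))
        return hits / len(self_ratings)

    print(calibration([2, 1, 2, 0], [90, 60, 70, 55]))  # prints 0.5

A proportion near 1.0 suggests well-calibrated self-assessment; a low proportion is a signal to revisit how you judge your own mastery.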



Calls for Research Collaboration

If you have a planned, new, or ongoing project for which you’d like collaborators from other institutions, submit a short description and explain the type of collaboration you are requesting. For example, you might need statistical assistance, you might want someone to try your approach using a different student population, etc.

Use the comment feature to submit your post.


Request for Assistance in Project Design

If you are interested in designing a research project that investigates the impact of incorporating metacognitive strategies in your classroom, use the comment feature to submit a short description of your idea and questions. The Improve with Metacognition creators and consultants will try to assist you, and other site visitors might also comment on your post. Note that it is your responsibility to apply for and receive IRB approval for the ethical use of human participants.