Metacognition at Goucher I: Framework and Implementation

by Jennifer McCabe & Justine Chasmar, Goucher College

Goucher College, a small private liberal arts college in Baltimore, Maryland, has in recent years focused curricular and co-curricular endeavors around metacognitive principles. Shortly after his appointment as Goucher's President in 2014, Dr. Jose Antonio Bowen introduced a framework for all campus endeavors called the "New 3Rs": Relationships, Resilience, Reflection. President Bowen charged all stakeholders at the college with enhancing and highlighting elements of the Goucher experience that already embodied these values, and with intentionally continuing to build a community guided by the 3Rs on multiple interacting levels.

For this first blog about how Goucher College has framed the student experience around improving metacognition, we will focus on the third "R," Reflection, and discuss how this guiding metacognitive principle is embedded in our new curriculum and how it drove the creation of new student and faculty support centers on campus.

Reflection is at the core of the Goucher Commons Curriculum, introduced in 2017 (https://www.goucher.edu/learn/curriculum/). Across the curriculum, and scaffolded over years spent at Goucher, students intentionally reflect in various ways, many of which would ideally lead to metacognitive development. In required first-year seminar courses, students are supported in considering their own place and privilege in society, and how these shape their thought processes and views of the world. The newly developed "Center-Pair Explorations" courses form the breadth of general education study across non-major disciplines; in these courses, students are encouraged to reflect on how disciplinary and interdisciplinary methodologies lead to scholarly innovation. Indeed, the mission of these (and other) courses is for students to develop the skill of collaborating on complex problems with others who are different from them (https://www.goucher.edu/learn/curriculum/student-learning-goals-and-outcomes/college-learning-goals); this inherently reflects elements of metacognition, with regard to identifying environments and strategies for successful goal attainment, along with awareness of situations in which additional information and/or a change in strategy is needed.

Reflection is a key embedded component in the required Study Abroad experience at Goucher as well: students are explicitly asked to frame their experience from multiple perspectives, to integrate it with their own knowledge and identity, and to think about what they are learning before, during, and after their time abroad. And, as a culminating experience at Goucher, seniors complete a Capstone project either through our Leadership Capstone program or through their disciplinary major. The key questions for the Leadership Capstone are: Who are you now? And who were you four years ago? Students are asked to articulate this journey of change in an oral presentation. For example, a psychology major may discuss how the combination of coursework, internship experiences, faculty mentorship, and on-campus co-curricular activities shaped her academic and personal journey and led to a path forward from Goucher. The specifics of Disciplinary Capstone experiences vary by major (e.g., research project, critical literature review, performance piece, community intervention), but the Capstone requirement inherently involves reflective, integrative, and critical thinking. The processes involved in such advanced academic work are equally cognitive and metacognitive in nature.

Ultimately, we want Goucher students to show metacognitive sophistication in understanding their own learning and thinking. Through the use of newly implemented e-portfolios, which are begun and supported with explicit instruction in first-year courses, students will be able to look back at various steps in their learning experiences along the way and reflect on improvements and growth areas. And we hope that by recently changing the name and framing of our end-of-semester student surveys from the more traditional "Course Evaluations" to the metacognitively focused "Student Reflections," we are more explicitly encouraging students to think about their learning experiences from multiple perspectives. Instead of focusing on strengths and weaknesses of the course and instructor, the framing starts with the student perspective: what they brought to the course, what they thought they were supposed to learn, how well they think they learned it, and ways in which the course and instructor supported their learning.

In addition to the ways in which Goucher has embedded metacognitive principles into the curriculum and academic practices, structural change has also occurred through the creation of new centers for faculty and student support. Several years ago, the Provost and select faculty planned for two new campus centers: the Center for the Advancement of Scholarship and Teaching (CAST) and the Quantitative Reasoning Center (QR Center). In the external search for directors, both position advertisements emphasized the desire for candidates who brought not only expertise in the pertinent content area (e.g., math and data skills for the QR Center) but also a nuanced understanding of growth mindset and the metacognitive principles important to student (and faculty) success.

CAST opened its doors in January 2017 to support faculty in teaching and research endeavors. This center, led by Dr. Robin Cresiski, sponsors a variety of workshops, including those that help faculty develop the Center-Pair Exploration courses discussed above. In addition, CAST holds informational and interactive sessions to help faculty understand evidence-based practices for student learning, as well as how to support students in developing metacognitive skills, in particular the essential lifelong skill of learning how to learn. For example, the popular Transparent Assignment Workshops invite instructors to bring existing assignment instructions and then use peer feedback to improve them with regard to clarity of purpose, directions, and measures of learning. In this way, CAST is helping the educators at Goucher develop more sophisticated metacognition about how to translate their own knowledge into effective and inclusive learning opportunities for students, built on an understanding of the conditions under which students (humans) learn best.

The QR Center debuted as a student-facing center in August 2017, with Dr. Justine Chasmar (co-author of this blog) as inaugural director. The QR Center supports students with their quantitative skill and content development across all disciplines at Goucher, with a focus on promoting quantitative literacy (www.goucher.edu/qrcenter). To foster these skills, the QR Center offers programming including tutoring, workshops, and academic consultations, using peers (called Q-tutors) as the primary medium of support. Q-tutors complete a training course that combines scaffolded reflection and practical exercises. A major focus of Q-tutoring is on fostering independence through the use of self-regulated learning and study strategies. Q-tutors are trained to teach and model study strategies such as self-testing and metacognitive monitoring, and to support student reflection through “checking for understanding” activities in each tutoring session. Metacognitive reflection is a guiding principle for tutor training and all student programming at the QR Center. A second blog in this series will focus on specifics of Q-tutor training at Goucher (Metacognition at Goucher II: Training for Q-Tutors).

Taken together, these programmatic and structural campus initiatives help to form multiple levels of interacting metacognitive support for our community of learners. Indeed, President Bowen often states the goal that Goucher helps students become “voracious self-regulated learners” (https://blogs.goucher.edu/intheloop/9210/goucher-lauded-by-doe-for-helping-all-students-find-success/). By naming Reflection as one of the “New 3Rs,” metacognition has become an embedded part of campus life with the potential to benefit all constituencies.

Recommended Reading

Bowen, J. A. (2012). Teaching naked: How moving technology out of your college classroom will improve student learning. San Francisco: Jossey-Bass.

Bowen, J. A., & Watson, C. E. (2017). Teaching naked techniques: A practical guide to designing better classes. San Francisco: Jossey-Bass.

Bowen, J. A. (2020). A new 3Rs: Using behavioral science to prepare students for a new learning economy. Baltimore, Maryland: Johns Hopkins University Press.


How can I help students become more expert learners, so they engage in active learning?

by Stephanie Chasteen, University of Colorado Boulder

This chapter focuses on helping students engage productively in active learning classrooms by teaching them to reflect on their learning and develop productive mindsets toward learning. It is part of a series on helping students engage productively in active learning classrooms and includes a list of tangible teaching and student metacognition strategies to use when working with students.


Metacognitive support for HIP student learning communities

by John Draeger, SUNY Buffalo State

In a previous post, I argued that metacognition can support undergraduate research because it encourages students to become aware of the inquiry process and it can help students make meaningful adjustments when things go off the rails (Draeger, 2018). Like undergraduate research, student learning communities are on the Association of American Colleges and Universities (AAC&U) list of high-impact practices (HIP). They make the list because they require multiple interactions between faculty and students about substantive matters as well as frequent, constructive feedback from faculty, and regular, structured processes for reflection and integration (Kuh 2008; Kilgo, Sheets & Pascarella 2015). In a similar vein, this post argues that instructors and students can benefit from being more metacognitive about their involvement in learning communities. While learning communities can take various forms, they involve groups of students taking a common set of courses at the same time with the same instructors. Learning communities aim to integrate learning experiences across courses in the community.

Sample models of student learning communities

Some models of learning communities involve groups of students taking a collection of courses co-taught by the same instructors. The co-teaching model promotes coordination and communication between instructors about course design, instruction, and assessment. Because students and instructors are present for class sessions in each of the courses, there are plenty of opportunities to make cross-disciplinary observations. Students, for example, can watch as instructors approach a common reading from very different points of view. However, the co-teaching model is not feasible at many institutions. Another model of learning community requires that a cohort of students take some of the same courses taught by the same instructors, but the courses are not co-taught. Because faculty are rarely in the same room at the same time, I would argue that it is all the more important that they take a metacognitive approach to their involvement in the student learning community.

Strategies for building metacognition into learning communities

At SUNY Buffalo State, we’ve developed a series of workshops and related materials to promote greater coordination and integration across student learning community courses. The following are just a few of those strategies. (Anyone interested in learning more about resource materials can contact me at draegejd@buffalostate.edu).

First, instructors can review the learning outcomes for each of the courses to look for points of similarity and departure. Points of convergence might be around content (e.g., themes that run through each of the courses) or around skills (e.g., reading, writing, critical thinking). Becoming aware of learning outcomes could, for example, lead to a conversation between instructors about how to reinforce what the other is doing. It could also alert them to places where they might inadvertently undermine the other’s efforts. Reviewing the learning goals emphasizes the importance of looking for opportunities to make explicit connections across each course. Awareness isn’t everything, but it can open space for the possibility of making meaningful adjustments.

Second, instructors can share the core ideas that are at the heart of their courses and that organize other course elements (Nosich, 2012). Identifying these fundamental ideas and being explicit about them with students is important because these ideas serve as anchor points, especially when students struggle. However, fundamental ideas can also serve as important landmarks across courses. Even if instructors cannot discuss one another's content with nuance, they can intentionally make connections to the big ideas. Better yet, instructors can take an "integration time-out" by asking students to relate the material in the current class to the fundamental concepts in each of the other courses. In this case, instructors are aware of the importance of integration and are looking for opportunities to intentionally make connections with the key elements of another's course.

Third, instructors can discuss how they approach giving feedback to students. It is no secret that frequent feedback promotes learning within a course, but students can also benefit from instructors being aware of what other instructors are doing. For example, instructors might use slightly different terminology to talk about similar things. Through conversation, they may decide to adopt a common lexicon. In this case, awareness promotes minor adjustments. In other cases, instructors might want to keep to their own way of doing things. However, they might be more explicit about how and why similar situations are being handled differently in different courses. The hope is that this will keep students from inadvertently going off the rails. It can also reinforce the notion that learning can be effective, albeit different, in differing contexts.

Fourth, instructors can explore why and how they promote student reflection. For example, some courses seek to expose students to new ideas, while others consider the complexity of a more focused set of ideas. Within a course, it is important to be explicit with students about the type of reflection being encouraged (e.g., deep, wide). It is also important to be explicit about structured reflections across the learning community courses. Is the goal to keep a running list of the various ways the content and skills in each course are similar and different? This approach speaks to the breadth of knowledge across fields of study and captures the sense that individual students can make meaningful connections in a wide variety of ways. Or is the goal to focus on finding the important connections between the fundamental concepts in each course? This approach speaks to the importance of sustained conversation about a narrow set of issues from multiple points of view. Both forms of reflection can be valuable, but instructors need to be intentional and explicit about structuring those experiences within and across their courses.

HIP student learning communities

If implemented well, learning communities can be HIP because they encourage students to consider the learning connections between their courses. I argue that metacognition can help instructors intentionally design and explicitly structure integrative learning opportunities. Metacognition can also help students become increasingly aware of similarities and differences across academic disciplines. In this way, metacognition and learning communities offer students the opportunity to learn how to make connections within and across fields of inquiry. Because the ability to make such connections is a hallmark of a lifelong learner, promoting metacognition through learning communities has the potential to be highly impactful in a student’s life for years to come.

References

Draeger, J. (2018). Metacognition supports HIP undergraduate research. Improve with Metacognition. Retrieved from https://www.improvewithmetacognition.com/metacognition-supports-hip-undergraduate-research/

Healey, M., & Jenkins, A. (2009). Developing undergraduate research and inquiry. York: HE Academy.

Kilgo, C. A., Sheets, J. K. E., & Pascarella, E. T. (2015). The link between high-impact practices and student learning: Some longitudinal evidence. Higher Education, 69(4), 509-525.

Kilgo, C. A., & Pascarella, E. T. (2016). Does independent research with a faculty member enhance four-year graduation and graduate/professional degree plans? Convergent results with different analytical methods. Higher Education, 71(4), 575-592.

Kuh, G. D. (2008). Excerpt from high-impact educational practices: What they are, who has access to them, and why they matter. Association of American Colleges and Universities.

Nosich, G. (2012). Learning to think things through: A guide to critical thinking across the disciplines. Saddle River, N.J.: Prentice Hall.

 


Addressing Student Resistance to Engaging in their Metacognitive Development

by Patrick Cunningham, Ph.D., Rose-Hulman Institute of Technology

You may be familiar with the quip,

“You can lead a horse to water, but you can’t make it drink.”

Perhaps you can't. However, my grandfather argued, "but you can put salt in its oats!" We can advise students on the importance of setting specific learning goals and accurately monitoring both their level of understanding and their learning processes. And I believe we should teach them how to be more metacognitive, but we can't make them do any of it. Nor do I think we should. Students should own their learning; they should experience agency and efficacy in it. But I can put "salt in their oats!" In this post I want to explore our role, as educators, in encouraging and providing opportunities for students to grow their metacognitive awareness and skills (i.e., our role as purveyors of "learning salt").

I recently found the book Why Students Resist Learning (Tolman & Kremling, 2017). While written about resistance to learning in general, it is relevant to student resistance to engaging in their metacognitive development. Student resistance is complex, with multiple interacting components. In my reading so far, I have been challenged by two overarching themes. First, student resistance isn't just about students. It's about us, the educators, too. Our interactions with students can exacerbate or ameliorate their resistance. Second, student resistance is a symptom of deeper issues, not a student characteristic in itself. For example, a student may be trying to preserve their sense of self and fear admitting a learning deficiency, or a student may have had prior experiences that affirmed surface approaches to learning and therefore resist the idea that they need strategies to develop deeper learning.

We, as educators, need to recognize and deal with our role in student resistance to metacognitive development. Our interactions with our students are largely influenced by our beliefs and attitudes about them. My colleagues and I have sought to address this in the B-ACE framework for giving formative feedback in support of metacognitive development. The 'B' represents an attitude of Believing the best about students. When we prepare to give feedback, we are responding to what they have written or said, which may or may not be accurate or complete. Believing the best acknowledges that we have incomplete information and need to reserve judgement. This attitude embodies sincere curiosity and seeks understanding. The remaining letters represent actionable elements of feedback: Affirm-Challenge-Encourage. Implementing our belief in the best about our students, we should seek to authentically affirm positive behaviors and growth, however small. Then we explore and seek to understand the broader contexts and details of their statements by asking questions. In this way, we can provide a gentle challenge to think more deeply or to discover incongruities between learning goals and behaviors. Finally, we close by encouraging them. Let your students know you believe in their abilities to become more skillful learners, with effort and perseverance. If you say it, make sure you mean it. You can also point them to potential strategies to consider. Let's see how we can implement the B-ACE framework as "learning salt."

In my teaching, I provide a variety of opportunities for my students to engage in their metacognitive development. At some point I ask something like, “What have you been doing differently since we last talked? How is it helping you be a more skilled and efficient learner?” One common type of response I get from engineering students is exemplified by:

“I am continuing to work practice problems to get ready for exams. I try to work through as many as I can. It works best for me.”

Okay. No change. I'm disappointed. First, I need to make sure I don't assume they are just memorizing and pattern matching, i.e., relying on surface learning approaches. Or, if they are memorizing and pattern matching, I need to believe it is in an honest effort to learn. Further, change is hard, and they may be trusting what is familiar and comfortable, even if it isn't the most effective and efficient. Now I need to ACE the rest of the feedback.

[Affirm] Good! You are taking intentional steps to prepare for your exams. [Challenge] How do you know it works best? What other strategies have you tried? [Encourage] Keep being intentional about your learning. You may want to try recall-and-review, explaining-to-learn, or creating your own problems to measurably test your understanding.

There will be a difference between written feedback and oral feedback, but notice that both include an opening for further interaction and prompt metacognitive reflection. In a face-to-face dialogue, there might be other questions depending on the responses, such as, "How are you working the problems? What will happen if the problem is asked in a way that is different from your practice?" In written feedback, I may want to focus on one question instead of a list, so as not to overwhelm the student with challenge. Notice that these questions seek additional information and point the student toward making connections. Still, the student may or may not take my suggestions to try something different. However, I argue this type of response is "saltier" than simply settling for their answer or telling them directly that their approach isn't as effective, and it may lead to further dialogue later on.

In a recent post, Aaron Richmond questions whether well-intentioned metacognitive instruction can, in specific cases, be unethical (Richmond, 2018). John Draeger provides a counterpoint in his response, but acknowledges the need to recognize and address possible adverse reactions to metacognitive instruction (Draeger, 2018). The B-ACE feedback framework both encourages student metacognition and is an expression of Ethical Teaching, as summarized by Richmond (2018). It acknowledges students' autonomy in their learning, seeks to avoid harm and promote their well-being, and strives to be unbiased and authentic. Further, it can address adverse reactions by helping students discover the deeper issues behind them.

In caring for our students, we want to see them grow. They aren't always ready. Prochaska, Norcross, and DiClemente (1994) delineate six stages of change, beginning with a lack of awareness of and willingness to change. Change takes time and effort. Even so, let's commit to making interactions with our students "salty"! Let's gently, quietly, and persistently encourage them in their metacognitive development.

References

Prochaska, J., Norcross, J., & DiClemente, C. (1994). Changing for Good. New York: Harper Collins.

Tolman, A. & Kremling, J. (Eds.). (2017). Why Students Resist Learning: A Practical Model for Understanding and Helping Students. Sterling, VA: Stylus.

Acknowledgements

This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757 & 1433645. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.


Metacognition is essential to ethical teaching

by John Draeger, SUNY Buffalo State

In his most recent post, Aaron Richmond considers the possibility that promoting metacognition might be unethical (Richmond, 2018). According to Richmond, ethical teaching requires promoting student autonomy by providing students with choices between learning strategies and promoting student welfare by safeguarding against harm. Richmond believes that activities promoting student metacognition may pose a potential threat to both student welfare and student autonomy. Thus, Richmond cautiously concludes that promoting student metacognition can be unethical.

Richmond illustrates his worry by considering the use of a metacognitive strategy that he has shared on our site (Richmond, 2017), namely Immediate Feedback Assessment Techniques (IF-AT). He worries that IF-AT can cause students undue anxiety, especially if they aren't given the option of alternative assignments. In his view, the presence of anxiety threatens welfare and the lack of options threatens autonomy. To avoid these pitfalls, Richmond recommends that instructors tell their students why and how particular teaching strategies will be used. He also recommends that instructors be on the lookout for the possibility that a particular strategy could cause unintended anxiety. And he advises that instructors be prepared to pre-warn students about the possibility of difficulty and to debrief them afterwards if difficulties occur. These safeguards are important because they protect student welfare and autonomy. I agree, though I argue below that metacognition is key to getting there. Richmond ends by posing three questions for us to think about. He asks, "Do you believe that some IwM practices have the potential to be unethical? If so, how do you ameliorate this issue? How do I become both an ethical and metacognitive teacher?" (Richmond, 2018). I will take each question in turn.

  1. Do I believe that some metacognitive practices have the potential to be unethical?

In short, no. It is possible that a metacognitive assessment, such as IF-AT, could inadvertently cause serious harm to a particular student. For example, a student facing serious psychological distress outside the classroom might find an assignment, any assignment, more than she can take. But the fact that a learning strategy could inadvertently harm a particular student does not show the strategy to be unethical. By analogy, there are many medical procedures that have been studied, approved, and shown to be effective. It is always possible that one of those procedures could inadvertently cause a particular patient serious harm. Doctors ought to be aware of the possibility and monitor the situation. They should be ready with remedies. But the fact that someone could be inadvertently harmed neither shows that doctors are unethical nor that the procedure should be discontinued. Likewise, if a learning strategy has been tested and shown to be effective, then it seems reasonable to try. Instructors should be aware of the possibility that some students might have an adverse reaction. But the fact that a particular student can be inadvertently harmed neither shows that instructors are unethical nor that use of the learning technique should be discontinued.

It is also possible that a well-intentioned instructor could try a teaching innovation (e.g., IF-AT) in hopes that student learning will improve only to find that it doesn’t meet that objective. There are plenty of reasons to be concerned about ineffective instruction, but being unethical is far more than being ineffective, suboptimal, or even a cause for concern. On the analogy with medicine, a particular medical procedure may not help a particular patient or even a group of patients, but it is hard to see how doctors can be unethical for trying something that they believe could work. In both cases, we hope that teachers and doctors will become aware of the problems and look to make meaningful adjustments (i.e. become more metacognitive about their practices). In contrast, it is possible that instructors could be intentionally undermining student learning efforts. Such instruction could be unethical. But I doubt this applies to instructors taking the time to design activities that promote student metacognition in hopes of enhancing student learning.

Richmond's concern about instruction implementing metacognitive learning strategies centers on whether they harm student welfare and undermine student autonomy. Returning to his illustration, Richmond worries that students may feel coerced into doing IF-AT (thus undermining choice) and that the uniqueness of the activity may cause undue anxiety (thus undermining welfare). I don't doubt that there are plenty of assignments and activities that students don't want to do and that these may stress them out. At some level, however, students have voluntarily opted into an educational system that will make demands on their time and energy, require hard work and dedication, and push their boundaries in order to facilitate their growth. Instructors should be mindful not to make unreasonable demands, but it is unclear how providing students with immediate feedback on their performance (IF-AT) constitutes coercion or any other unethical behavior. Moreover, I have argued that instructors should promote constructive discomfort in an attempt to nudge students towards learning growth (Draeger, 2014). More specifically with regard to IF-AT, a student might feel anxiety associated with learning that they don't know as much as they thought they did, but I suspect that these negative feelings will be offset by the positive feelings associated with improved performance.

In short, well-meaning learning strategies, including metacognitive ones, can be ineffective and in some cases can even inadvertently cause serious harm to specific students. But I see no reason to think this shows that instruction promoting student metacognition can be unethical.

  2. If so, how do you ameliorate this issue?

Though I don't think that incorporating metacognition into one's course is unethical, I do believe that it is the key to ameliorating the sorts of concerns Richmond raises. For example, Richmond hopes to raise awareness about the possible unintended consequences of well-meaning pedagogical best practices. He rightly points out that we should not assume that good intentions will carry the day. He argues for the importance of procedural safeguards when implementing assignments, such as being explicit about the purpose of an assignment, pre-warning students about pitfalls, and debriefing students afterwards. These safeguards could help promote student welfare. He argues for the value of giving students the choice between a variety of assignments. Offering multiple entry points into content could both improve student learning and increase student autonomy. This is good advice because it is a hallmark of good teaching.

I would venture to say that Richmond’s advice is a hallmark of good teaching because it is an example of metacognitive teaching. For example, if instructors should be mindful of student anxiety and discomfort, and use that awareness to guide their pedagogical choices, then promoting metacognition is how we get there. In this case, a metacognitive instructor would become aware of a student need (e.g., reduction of anxiety) and self-regulate by making the necessary adjustments (e.g., offering alternative assignments in order to reduce that anxiety). In my view, therefore, metacognition itself is the way to ameliorate Richmond’s concerns.

  3. How do I become both an ethical and metacognitive teacher?

Metacognition is not a magic wand that guarantees student success. Metacognitive instruction does, however, ask instructors to become increasingly aware of what works (and what doesn’t work) with an eye towards making adjustments that are likely to improve student learning. Metacognitive instructors can monitor roadblocks to learning and help students find ways to overcome them. It is possible that an assignment, such as IF-AT, might not help a particular group of students get where they need to go. If so, then a metacognitive instructor will monitor student progress, recognize that it is not working, and intentionally make a change. The instructor might decide that the assignment should be discontinued. In this case, however, the assignment would be discontinued because it was ineffective and not because it was unethical. In my view, it is metacognitive instruction that identifies the problem and proposes a solution.

In short, if the goal is to promote awareness of student learning needs and the importance of making meaningful adjustments so that those needs are met, then it seems that metacognition is the key to both student welfare and student autonomy. And if, as Richmond argues, being ethical requires promoting welfare and autonomy, then metacognition is essential to ethical teaching.

References

Draeger, J. (2014). “Cultivating the habit of constructive discomfort.” Retrieved from https://www.improvewithmetacognition.com/cultivating-a-habit-of-constructive-discomfort/

Richmond, A. (2018). “Can metacognitive instruction be unethical?” Retrieved from https://www.improvewithmetacognition.com/can-metacognitive-instruction-be-unethical/

Richmond, A. (2017). “Scratch and win or scratch and lose? Immediate Feedback Assessment Technique.” Retrieved from https://www.improvewithmetacognition.com/scratch-win-scratch-lose-immediate-feedback-assessment-technique/


Can Metacognition Instruction be Unethical?

By Aaron S. Richmond, Ph. D., Metropolitan State University of Denver

Many of us college and university teachers incorporate great metacognitive activities into our coursework with the lofty goal of trying to improve the metacognition of our students. These activities include various assessment techniques (see Richmond, 2017, March; Was, 2014, August), instructional interventions (see Draeger, 2016, November), and course designs (see McCabe, 2018, March). But have we ever questioned whether these Improve with Metacognition (IwM) educational practices are ethical? In other words, when we do these great activities, assessments, or other techniques, are we implementing them in an ethical way?

The reason I embarked down this road was a conversation with one of my all-time favorite teachers (Doug Woody from the University of Northern Colorado) about using active learning and metacognitive strategies in classroom instruction. He leaned over, mid-sentence, and said, "You know that sometimes, when done improperly, using those [metacognitive instruction] strategies may cause students to feel disrespected, out of control, cause feelings of distrust, and in some rare occasions cause harm." I just looked back at him with shock, incredulity, and a creeping sense of horror. Incredulity, because I felt that I was trying to do the best thing for the student, so how could that be bad? Shock, because I had never thought of it that way. And a creeping feeling of horror because just maybe, maybe he could be right.

Ethical Teaching

But first, let me explain the nature of Ethical Teaching. Eric Landrum and Maureen McCarthy recently published the book Teaching ethically: Challenges and opportunities (2012). In it, they explain that ethical teachers focus on students' respect and autonomy, nonmaleficence, beneficence, justice, fidelity, and caring. Specifically, as teachers, we should allow students the right to make their own decisions (respect and autonomy), above all do no harm (nonmaleficence), promote our students' welfare (beneficence), be fair, unbiased, and equal (justice), and be trustworthy and honest (fidelity). Thus, ethical teachers set out to improve their students' learning by these guiding principles.

So how can IwM practices be potentially unethical? Again, when discussing this with Doug, my initial reaction was, "Of course not! I'm trying to IMPROVE my students' metacognition, which I KNOW will help them not only in my class, but throughout their college career." However, upon reflecting on what it means to be an ethical teacher, it may be that implementing such IwM techniques improperly is in fact unethical.

Let me illustrate. I've touted the use of Immediate Feedback Assessment Techniques (IF-AT) as a metacognitive tool for assessment (Richmond, 2017, February). IF-AT instantaneously provides performance feedback to learners by allowing students to scratch off what they believe to be the correct answer on a multiple-choice exam, quiz, or test. However, if implemented incorrectly, IF-AT may cause students to feel coerced (the opposite of autonomy) into taking an assessment in a format that they don't want to take or, more importantly, one in which they may perform more poorly than in other formats. For example, because IF-AT is so unique and takes some time to get used to, students may feel that there is undue pressure on them to use this format without other options. A tenet of learner-centered pedagogy and ethical teaching is to provide options for students to choose from. Additionally, as I argued in the previous blog, using IF-AT may, in some cases, do more harm than good (the opposite of nonmaleficence) if the format causes students anxiety and stress. That is, most assessments do cause some anxiety and stress (which at low to moderate levels can be good for learning); however, IF-AT may cause students to experience exceptionally high levels of stress and anxiety and consequently decrease their performance. Finally, the question then becomes whether IF-AT promotes student welfare (beneficence). Of course, we metacognitive teachers believe that this is why we are employing such strategies, but if harm is done, then it is definitely not beneficial to the students.

There are other examples of metacognitive activities that may be unethical (e.g., forcing students to do activities without pre-warning them or giving them options not to participate); however, I think the silver lining is that it may not be the activity itself, but rather how instructors implement these IwM activities.

How Can I Be an Ethical Teacher AND Improve with Metacognition?

Recently, my colleagues Regan Gurung and Guy Boysen and I (2016) tackled this very issue in our book on model college and university teaching. We suggested several steps that teachers can take to be both ethical and metacognitive/model teachers. First, to provide respect and autonomy, we should let our students opt out of certain activities or give them alternatives (Richmond et al., 2016). For example, give students the option to take the IF-AT or a traditionally formatted quiz. Second, to increase fidelity, we should give forewarning of potential adverse or negative feelings or attitudes that may result from participating in an IwM activity. For example, with IF-ATs, let your students know that they may get anxious when they realize they missed the first two questions; or, if doing a metacognitive activity that puts certain students at a disadvantage (e.g., an experiment on the use of elaboration vs. flash cards), let them know that it is intentionally designed that way and is not a reflection of their skills or abilities. Third, to promote nonmaleficence, always discuss the purpose of your IwM activities. For example, discuss why you want to teach them various learning or memory strategies and why these should be beneficial. By doing this you are a more transparent teacher, which leads to what I believe being a metacognitive teacher embodies: promoting beneficence by using effective IwM strategies that are known to work in many contexts (Richmond et al., 2016).

Concluding Thoughts and Questions for You

As illustrated, it may be possible that when we use IwM activities, we are engaging in some unethical teaching practices. However, I think there are a few things we can do to avoid this dilemma, and much of it has to do with how IwM activities are implemented. Thus, I would like to conclude with a few questions that I hope you will take the time to answer, and so start a conversation on this important but often overlooked issue within IwM:

  1. Do you believe that some IwM practices have the potential to be unethical?
  2. If so, how do you ameliorate this issue?
  3. How do I become both an ethical and metacognitive teacher?

References

Draeger, J. (2016, November). Promoting metacognitive reading through Just-in-Time Teaching. Retrieved from https://www.improvewithmetacognition.com/promoting-metacognitive-reading-just-time-teaching/

Landrum, R., & McCarthy, M. A. (Eds.) (2012). Teaching ethically: Challenges and opportunities. Washington, D. C.: American Psychological Association

McCabe, J. (2018, March). Small metacognition—Part 1. Retrieved from https://www.improvewithmetacognition.com/small-metacognition-part-1/

Richmond, A. S. (2017, March). Joining forces: The potential effects of team-based learning and immediate feedback assessment technique on metacognition. Retrieved from https://www.improvewithmetacognition.com/joining-forces-the-potential-effects-of-team-based-learning-and-immediate-feedback-assessment-technique-on-metacognition/

Richmond, A. S. (2017, February). Scratch and win or scratch and lose? Immediate feedback assessment technique. Retrieved from https://www.improvewithmetacognition.com/scratch-win-scratch-lose-immediate-feedback-assessment-technique/

Richmond, A. S., Gurung, R. A. R., & Boysen, G. (2016).  An evidence-based guide to college and university teaching: Developing the model teacher. New York, NY: Routledge.

Was, C. (2014, August). Testing improves knowledge monitoring. Improve with Metacognition. Retrieved from https://www.improvewithmetacognition.com/testing-improves-knowledge-monitoring/


Measuring Metacognitive Self-Assessment – Can it Help us Assess Higher-Order Thinking?

by Dr. Ed Nuhfer, California State Universities (retired)

Since 2002, I've built my "Developers' Diary" columns for the National Teaching and Learning Forum (NTLF) around the theme of fractals and six essential components in the practice of college teaching: (1) affect, (2) levels of thinking (intellectual & ethical development), (3) metacognition, (4) content knowledge & skills, (5) pedagogy, and (6) assessment. The first three focus on the internal development of the learner, and the last three focus on the knowledge being learned. All six are interconnected through being part of the same complex neural networks employed in practice.

In past blogs, we noted that affect and metacognition, until recently, were deprecated and maligned by behavioral scientists, with the most deprecated aspect of metacognition being self-assessment. The highest levels of thinking discovered by Perry are heavily affective and metacognitive, so some later developmental models shunned these stages when only cognition seemed relevant to education. However, the fractal model advocates for practicing through drawing on all six components. Thus, metacognition is not merely important for its own merits; we instructors rely on metacognitive reflection to monitor whether we are facilitating students’ learning through attending to all six.

The most maligned components, affect and self-assessment, may offer a key to measuring the overall quality of education and assessing progress toward the highest levels of thinking. Such measurements have been something of a Grail quest for developers. To date, efforts to obtain them have proven labor intensive and expensive.

Measuring: What, Who, Why, and How?

The manifestation of affect in the highest Perry stages indicates that cognitive expertise and skills eventually connect to affective networks. At advanced levels of development, experts' affective feelings are informed feelings that lead to rapid decisions for action that are usually effective. In contrast, novices' feelings are not informed. Beginners' approaches are tentative, relying on trial and error rather than an efficient path to a solution. By measuring how well students' affective feelings of their self-assessed competence have integrated with their cognitive expertise, we should be able to assess their stage of progress toward high-level thinking.

To assess a group's (a class, class rank, or demographic category) state of development, we can obtain the group's mean self-assessments of competence on an item-by-item basis from a valid, reliable multiple-choice test that requires some conceptual thinking. We have such a test in the 25-item Science Literacy Concept Inventory (SLCI). We can construct a knowledge survey of this Inventory (KSSLCI) to give us 25 item-by-item self-assessed estimates of competence from each participant.

As demonstrated in 2016 and 2017, item-by-item averages of group responses attenuate the random noise present in individuals’ responses. Thus, assessments done by using aggregate information from groups can provide a clear self-assessment signal that allows us to see valid differences between groups.

If affective self-assessed estimates become increasingly informed as higher-level thinking capacity develops, then we should see the aggregate item-by-item paired measures correlate with increasing strength as groups gain in participants who possess higher-order thinking skills. We can indeed see this trend.
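For readers who want to see the mechanics, here is a minimal sketch of the item-by-item aggregation and best-fit calculation described above, written in Python with NumPy and SciPy. The function and array names are illustrative placeholders, not the actual analysis code behind the SLCI/KSSLCI results reported below.

```python
import numpy as np
from scipy import stats

def item_level_fit(actual, self_assessed):
    """Correlate item-by-item group means of actual vs. self-assessed competence.

    actual, self_assessed: arrays of shape (participants, items) on the same
    scale, e.g., 25 columns for the 25 SLCI/KSSLCI items. Both arrays are
    placeholders for whatever paired data a study like this collects.
    """
    actual_means = actual.mean(axis=0)                # one group mean per item
    self_assessed_means = self_assessed.mean(axis=0)  # one group mean per item
    slope, intercept, r, p, se = stats.linregress(self_assessed_means, actual_means)
    return slope, r, r ** 2  # r**2 = proportion of variance explained
```

Averaging over participants before fitting the line is what attenuates the individual-level noise; the resulting slope, r, and r-squared then describe the group rather than any one person.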

Picture the Results

For clear understanding, it is useful first to see what graphs of paired measures of random noise (meaningless nonsense) look like (Figure 1A) and how paired measures look when they correlate perfectly (Figure 1B). We produce these graphs by inputting simulated data into our SLCI and KSSLCI instruments (Figure 1).

Random nonsense produces a nearly horizontal line along the mean (“regression to the mean”) of 400 random simulated responses to each of the 25 items on both instruments. The best-fit line has values of nearly zero for both correlation (r) and line slope (Figure 1A).

We use a simulated set of data twice to get the pattern of perfect correlation, in which the participants' mean SLCI and KSSLCI scores for each item are identical. The best-fit line (Figure 1B) has a correlation (r) and a line slope both of about unity (1). The patterns from actual data (Figure 2) show slopes and correlations that fall between random noise and perfect order.


Figure 1. Modeling correlational patterns with simulated responses to a measure of competence (SLCI) and a measure of self-assessed competence (KSSLCI). A shows correlational pattern if responses are random noise. B shows the pattern if 400 simulated participants perfectly assessed their competence.
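As a rough illustration (not the authors' actual simulation code), the two baseline patterns in Figure 1 can be reproduced in a few lines of Python; the participant and item counts below simply mirror the numbers described above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_participants, n_items = 400, 25

# Figure 1A analogue: independent random responses on both instruments.
noise_test = rng.random((n_participants, n_items)).mean(axis=0)
noise_survey = rng.random((n_participants, n_items)).mean(axis=0)
slope_a, _, r_a, _, _ = stats.linregress(noise_survey, noise_test)

# Figure 1B analogue: the same simulated data fed to both instruments,
# so each item's mean self-assessment equals its mean score exactly.
shared = rng.random((n_participants, n_items)).mean(axis=0)
slope_b, _, r_b, _, _ = stats.linregress(shared, shared)

print(f"random noise:      slope={slope_a:.2f}, r={r_a:.2f}")  # both near zero
print(f"perfect agreement: slope={slope_b:.2f}, r={r_b:.2f}")  # both near one
```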

Next, we look at the actual data obtained from 768 novices (freshmen and sophomores; Figure 2A). Novices' self-assessed competence and actual competence have a significant positive correlation: the slope is 0.319 and r is .69, so the self-assessment measures explain about half of the variance (r² ≈ .48) in SLCI scores. Even novices do not appear to be "unskilled and unaware of it." Developing experts (juniors, seniors, and graduate students; N = 831 in Figure 2B) produce a fit line with a slightly steeper slope of 0.326 and a stronger r of .77. Here, the self-assessment measures explain about 60% of the variance in the Inventory scores.

When we examine experts (109 professors in Figure 2C), the fit line steepens to a slope of 0.472, and a correlation of r = .83 explains nearly 70% of the variance in Inventory scores. The trend from novice to expert is clear.

Finally, Figure 2D shows the summative mean SLCI scores and KSSLCI ratings for the four undergraduate ranks plus graduate students and professors. The KSSLCI and SLCI values increase in order of academic rank. The correlation (r) between the paired measures is close to unity, and the slope of 0.87 produces a pattern very close to that of perfect self-assessment (Figure 1B).


Figure 2: Correlations from novice to expert of item-by-item group means of each of the 25 items addressed on the KSSLCI and the SLCI. Panel A contains the data from 768 novices (freshmen and sophomores). B consists of 831 developing experts (juniors, seniors and graduate students). C comes from 109 experts (professors). Panel D employs all participants and plots the means of paired data by academic rank. We filtered out random guessing by eliminating data from participants with SLCI scores of 32% and lower.

Figure 2 supports four conclusions: that self-assessments are not random noise; that knowledge surveys reflect actual competence; that affective development occurs alongside cognitive development; and that a group's ability to accurately self-assess seems indicative of the group's general state of intellectual development.

Where might your students fall on the continuum of measures illustrated above? By using the same instruments we employ, your students can get measures of their science literacy and self-assessment accuracy, and you can get an estimate of your class's present state of intellectual development. The work that led to this blog is under IRB oversight, and getting these measures is free. Contact enuhfer@earthlink.net for further information.


Embedding Metacognition into New Faculty Orientation

By Lauren Scharff, Ph.D., U. S. Air Force Academy *

When and how might faculty become aware of metacognition in general, of how student metacognition might enhance student learning, and of how personal metacognition might enhance their own teaching? Ideally, faculty would have learned about metacognition as students and thereafter consciously engaged in metacognitive practices as learners and developing professionals. Based on conversations with many faculty members, however, this is not the case. It certainly wasn't the case for me. I don't remember even hearing the term metacognition until after many years of working as a professor. Even now, most new faculty seem to have only a vague familiarity with the term "metacognition" itself, and few claim to have spent much time considering how reflection and self-regulation, key components of metacognition, should be part of their own practice or part of the skill set they plan to help develop in their students.

While this reality is not ideal (at least for those of us true believers in the power of metacognition), realization of this lack of understanding about metacognition provides opportunities for faculty development. And why not start right at the beginning when faculty attend new faculty orientation?

New Faculty Orientation

At my institution this summer, we did just that. Our Director of Faculty Development, Dr. Marc Napolitano, worked the topic into his morning session on student learning. We designed a follow-on, small-group discussion session that encouraged faculty to actively engage in reading, personal application, and discussion of metacognition.

The reading we chose was one of my favorite metacognition articles, Promoting Student Metacognition, by Dr. Kimberly Tanner (2012). The session was only 40 minutes, so we had them read just a few pages of the article for the exercise, including her Table 1, which provides a series of questions students can ask themselves when planning, monitoring, and evaluating their learning for a class session, while completing homework, and while preparing for an exam. We had the new faculty jot down some reflections in response to several guided prompts. Then we had time for discussion. I facilitated one of the small groups and was thus able to hear some of their responses first-hand.

Example questions:

  • What type of student were you as an undergraduate? Did you ever change your approach to learning as you went through school?
  • You obviously achieved success as an undergraduate, but do you think that you could have been more successful if you had better understood the science of learning and had teachers incorporate it into their courses?
  • If you had to share a definition of metacognition [from the reading] with students – and explain to them why it is an essential practice in learning – which definition would you use and how would you frame it with students?
  • If you wished to incorporate metacognition into your class, what approach(es) currently seems most practical for you? Why?
  • Which 3-4 of the questions in Table 1 seem like they would be most helpful to use in your class? Why do these questions stand out, and how might they shape your class?

The discussion following the reading and reflection time was very rich. Only one member of my group of eight reported a good prior understanding of metacognition and how it could be incorporated into course design (she had just finished a PhD in physics education). Two others reported having vague prior familiarity with the term. However, after participating in these two faculty development sessions, all of them agreed that learning about the science of learning would have been valuable as a student regardless of level (K-12 through graduate school).

The faculty in my group represent a wide variety of disciplines, so the ways of incorporating metacognition and the questions from the table in the reading that most appealed to them varied. However, that is one of the wonderful things about designing courses or teaching practices to support student metacognition – there are many ways to do so. Thus, it’s not a problem to fit them to your way of teaching and your desired course outcomes.

We also spent a little time discussing metacognitive instruction: being aware of their choices as instructors and their students’ engagement and success, and using that awareness to guide their subsequent choices as instructors to support their students’ learning. They quickly understood the parallels with student metacognitive learning (students being aware of their choices and whether or not those choices are leading to success, and using that awareness to guide subsequent choices related to their learning). Our small groups will continue to meet throughout the coming year as a continuation of our new faculty development process. I look forward to continuing our conversations and further supporting them in becoming metacognitive instructors and promoting their students’ development as metacognitive learners.

————

Tanner, K. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11, 113–120.

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Helping Students Feel Responsible for Their Learning

by Patrick Cunningham, Ph.D., Rose-Hulman Institute of Technology

"Dr. C, you really expect your students to do a lot!" I quickly replied, "Yes!" We then engaged in a discussion of the things only students can do for their learning. How can we help more of our students recognize their responsibility for their learning? Three strategies I employ are explicit and direct instruction, questioning for self-discovery, and in-class opportunities to practice new learning strategies. Each of these strategies directs students' focus to things under their control.

Helping our students recognize and embrace their responsibility for their learning requires metacognitive activity. Specifically, it requires building metacognitive knowledge of persons and strategies and engaging in metacognitive regulation through planning for and monitoring learning experiences. Direct instruction and in-class learning strategy practice can expand metacognitive knowledge. Questioning for self-discovery can facilitate students' metacognitive monitoring and planning for subsequent learning experiences.

For explicit and direct instruction, I start a discussion within the first two days of class by asking, “What does it mean to learn something?” Most responses include applying and explaining concepts. Good answers, but I press for more depth. In turn I respond, “Apply to what? Explain to whom?” Learning something, they say, means being able to apply concepts to real circumstances. My engineering students also come up with a variety of people or groups of people to explain things to: their grandmother, family members, a cross-functional design team, a boss, peer engineers, marketing/sales professionals, or even customers. These answers are good operational definitions of learning. Next, I talk to my students about the knowledge frameworks that underlie these abilities.

[Image: Illustration of knowledge frameworks]

In order to apply concepts to real and diverse circumstances and to explain concepts effectively to a range of audiences we must have many routes to and between the elements of our knowledge and a logical structure of the information. That is, our knowledge frameworks must be well populated, richly interconnected, and meaningfully organized (Ambrose et al., 2010). However, as novices in an area, we start with sparsely populated and isolated knowledge frameworks. I then share with students that they are the only ones who can construct their knowledge frameworks. The population and interconnection of elements depends on what they individually do with the material, in class and out of class. As the instructor, I can create opportunities and experiences for them, but I cannot build their knowledge frameworks for them. Students are responsible for the construction work.

For self-discovery I use guiding questions to help students articulate learning goals, combat the Illusion of Comprehension, and make cause-and-effect linkages between their learning behaviors and outcomes. I may ask, “What goals do you have for your homework/study sessions?” Students often focus on getting assignments done or being “ready” for exams, but these are not, strictly speaking, learning goals. It is helpful here to ask what they want or need to be able to do with the information, which elicits responses such as: “Apply ____ to ____. Create a ____ using ____. Explain ____.” Now we can ask students to put the pieces together. How does just “getting the homework done” help you know if you can apply/create/explain? We are seeking to help students surface incongruities in their own behavior, and these incongruities are easier to face when you discover them yourself rather than being told they are there.

A specific incongruity that many students struggle with is the Illusion of Comprehension (Svinicki, 2004), which occurs when students confuse familiarity with understanding. It often manifests itself after exams as, “I knew the material, I just couldn’t show you on the exam.” My favorite question for this is, “How did you know you knew the material?” Common responses include looking over notes or old homework, working practice exams, and reworking examples and homework problems. But what does it mean to “look over” prior work? How did you work the practice exam? How did you elaborate around the concepts so that you weren’t just reacting to cues in the examples and homework problems? What if the context of the problem changes? It is usually around this point that students begin to realize the mismatch between their perceptions of deep understanding and the reality of their surface learning.

Assignment or exam wrappers are also good tools to help students work out cause-and-effect linkages between what they do to learn material and how they perform. In general, these “wrappers” ask students to reflect on what they did to prepare for the assignment or exam, process instructor feedback or errors, and adjust future study plans.

It is important, once we encourage students to recognize these incongruities, that we also help direct students back to what they can do to make things better. I direct conversations with my students to a variety of learning strategies they can employ, slanted towards elaborative and organizational strategies. We talk about such things as making up problems or questions on their own, explaining solutions to friends, annotating their notes to summarize key points, or doing recall and reviews (retrieval practice).

However, I find that telling them about such strategies often isn’t enough. We trust what is familiar and comfortable – even ineffective and inefficient learning strategies that we have practiced over years of prior educational experiences and for which we have been rewarded. So I build these unfamiliar, but effective and efficient, strategies into my teaching. I want my students to know how to do them and to realize that they can use them in their out-of-class study time as well.

One way I engage students with new strategies is through constructive review prior to exams. We start with a recall and review exercise. I have students recall as many topics as they can in as much detail as they can for a few minutes – without looking anything up. Then I have students open their notes to add to and refine their lists. After collectively capturing the key elements, I move to having pairs of students work on constructing potential questions or problems for each topic. I also create a discussion forum for students to share their problems and solutions – separately. As they practice with each other’s problems, they can also post responses and any necessary corrections.

In concert, direct instruction, questioning for self-discovery, and in-class opportunities to practice new learning strategies can develop our students’ sense of responsibility for their learning. It can even empower them by giving them the tools to direct their future learning experiences. In the end, whether they recognize it or not, students are responsible for their learning. Let’s help them embrace this responsibility and thrive in their learning!

References

Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco, CA: Jossey-Bass.

Svinicki, M. (2004). Learning and Motivation in the Postsecondary Classroom. San Francisco, CA: John Wiley & Sons.

Acknowledgements

This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757 & 1433645. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.

Next Blog-post:

Overcoming student resistance to engaging in their metacognitive development.


Metacognitions About a Robot

by Roman Taraban, Ph.D., Texas Tech University

Imagine a time when intelligent robots begin interacting with humans in sophisticated ways. Is this a bit far-fetched? Probably not, as there already exist compelling examples of just that. Sophia, a robot, so impressed her Saudi audience at an investment summit in 2017 that she was granted Saudi citizenship. Nadine, another robot, is an emotionally intelligent companion whose “intelligent behavior is almost indistinguishable from that of a human”. The coming exponential rise of artificial intelligence into all aspects of human behavior requires a consideration of possible consequences. If a machine is a billion times more intelligent than a human, as some predict will happen by 2045, what will cognitive and social interactions with such superhuman machines be like? Chris Frith (2012) argues that a remarkable human capacity is metacognition that concerns others. However, what if the “other” is an intelligent machine, like a robot? Is metacognition about a robot feasible? That is the question posed here. Four aspects of metacognition are considered: the metacognitive experience, theory of mind, teamwork, and trust. Other aspects could be considered, but these four should be sufficient to get a sense of the human-machine metacognitive possibilities.


Flavell (1979) defined metacognitive experiences as follows: “Metacognitive experiences are any conscious cognitive or affective experiences that accompany and pertain to any intellectual enterprise. An example would be the sudden feeling that you do not understand something another person just said” (p. 906). Other examples include wondering whether you understand what another person is doing, or believing that you are not adequately communicating how you feel to a friend. We can easily apply these examples to intelligent machines. For instance, I might have a sudden feeling that I did not understand what a robot said, I might wonder if I am understanding what a robot is doing, or I may believe that I am communicating poorly with the robot. So it appears to be safe to conclude that we can have metacognitive experiences involving robots.

Other instances of metacognition involving intelligent machines, like robots, are problematic. Take, for instance, mentalizing or Theory of Mind. In mentalizing, we take account of (monitor) others’ mental states and use that knowledge to predict (control) others’ and our own behavior. In humans, the ability to reason about the mental states of others emerges between the ages of 4 and 6 years and continues to develop across the lifespan. In a typical test of this ability, a child observes a person place an object in drawer A. The person then leaves the room. The child observes another person move the object to drawer B. When the first person returns, the child is asked to predict where the person will look for the object. Predicting drawer A is evidence that the child can think about what the other person believes, and that the child recognizes that the other person’s beliefs may not be the same as the child’s own knowledge. Theory of mind metacognition directed towards humans is effective and productive; however, theory of mind metacognition directed towards intelligent machines is not likely to work. The primary reason is that theory of mind is predicated on having a model of the other person and being able to simulate the experience of the other person. Because intelligent machines process information using algorithms and representations that differ from those humans use, it is not possible to anticipate the “thinking” of these machines and therefore predict their behavior in a metacognitive manner, i.e., by having a theory of the other mind. Presently, for instance, intelligent machines use deep learning networks and naïve Bayes algorithms to “think” about a problem. The computational methods employed by these machines differ from those employed by humans.

What about teamwork? According to Frith (2012), humans are remarkable in their ability to work together in groups. Teamwork accounts for humans’ incredible achievements. The ability to work together is due, in large part, to metacognition. The specific factor cited by Frith is individuals’ willingness to share and explain the metacognitive considerations that prompted their decision-making behavior. For group work to succeed, participants need to know the goals, values, and intentions of others in the group. As has been pointed out already, machine intelligence is qualitatively different from human intelligence, so that is one barrier to human-machine group work. Further, the benefits of group work depend on a sense of shared responsibility. It is currently unknown whether or how a sense of cooperation and shared responsibility would occur in human-machine decision making and behavior.

There is one more concern related to machine intelligence that is separate from the fact that machines “think” in qualitatively different ways compared to humans. It is an issue of trust. In some cases of social interaction, understanding information that is being presented is not an issue. We may understand the message, but wonder if our assessment of the source of the information is reliable. Flavell (1979) echoed this case when he wrote: “In many real-life situations, the monitoring problem is not to determine how well you understand what a message means but to determine how much you ought to believe it or do what it says to do” (p. 910). When machines get super smart, will we be able to trust them? Benjamin Kuipers suggests the following: “For robots to be acceptable participants in human society, they will need to understand and follow human social norms. They also must be able to communicate that they are trustworthy in the many large and small collaborations that make up human society” (https://vimeo.com/253813907).

What role will metacognitions about super-intelligent machines have in the future? Here I argue that we will have metacognitive experiences involving these machines. Those experiences will occur when we monitor and regulate our interactions with the machines. However, it is not clear that we will be able to attain deeper aspects of metacognition, like theory of mind. This is because the computations underlying machine intelligence are qualitatively different from human computation. Finally, will we be able to trust robots with our wealth, our children, our societies, our lives? That will depend on how we decide to regulate the construction, training, and deployment of super-intelligent machines. Flavell (1979) often brings affect, emotion, and feelings into the discussion of metacognitive experiences. Kuipers emphasizes the notions of trust and ethics. These are all factors that computer scientists have not begun to address in their models of intelligent machine metacognition (Anderson & Oates, 2007; Cox, 2005). Hopefully, solutions can be found that enable rich and trustworthy relationships with smart machines.

References

Anderson, M. L., & Oates, T. (2007). A review of recent research in metareasoning and metalearning. AI Magazine, 28(1), 12.

Cox, M. T. (2005). Field review: Metacognition in computation: A selected research review. Artificial Intelligence, 169(2), 104-141.

Flavell, J. H. (1979). Metacognition and cognitive monitoring. American Psychologist, 34(10), 906-911.

Frith, C. D. (2012). The role of metacognition in human social interactions. Philosophical Transactions of the Royal Society B: Biological Sciences, 367(1599), 2213-2223.


Using Metacognition to Support Graduating High School Seniors with a LD to Prepare and Transition Successfully to College (Part II)

by Mary L. Hebert, PhD
Campus Director, The Regional Center for Learning Disabilities,
Fairleigh Dickinson University

High school commencement evokes images of caps and gowns and goodbyes to four years of familiar teachers, friends, routines, challenges, and successes. While the focus seems to be on completing a phase of one’s life, commencement actually means a beginning or a start. With high school now a chapter completed, the summer months will be spent preparing for the transition to college. ALL students entering college will have similar adjustments. Students with a history of a learning disability, however, may benefit from a purposeful, strategic, or more metacognitive plan for the transition.

Transition and Related Feelings

Students who have had a 504 plan or Individualized Education Plan (IEP) during their K-12 years may face concerns similar to those of other students, yet with a heightened sensitivity to things such as academic performance, managing the pace and independence of college life, leaving behind supports and resources that have been familiar and helpful, and wondering where, and whether, resources at college will be available and helpful. They will have similar concerns about making new friends, like any first-year student, but these may be heightened if the student has had social challenges accompanying their LD. Students with a history of LD also often describe the challenge of balancing work, study, relaxation, and social time. Hall and Webster (2008) found that college students with LD report self-doubt about being able to perform as well as their non-LD college peers. Encouraging active preparation that fosters self-awareness and builds strategies of approach will enrich this metacognitive preparation.

In this post, I will continue my series on how we can use metacognitive practices to support LD students during this transition time (see also Part I). Here I will focus on three key areas: academics, social interactions, and finding balance. Prompts in the form of questions are suggested for each area. Metacognition enriches self-awareness through prompts and reflection, building the kind of high-level critical thinking that students can apply both to the situations they encounter and to how they function.

I propose that metacognition can be applied before day one at college, helping students approach the transition more strategically before they ever step onto campus.

Academics:

Most students ponder how college will be different from high school. Students with learning disabilities frequently ponder this even more. College academics will be different. Typically, the biggest differences students experience are the pace of coursework and the degree of independence expected in preparing for and mastering the material. Students can be encouraged to talk through, or better yet to write down, their reflections on prompts that increase self-awareness about the differences they anticipate and the strategies they might apply to manage those differences (i.e., encourage metacognition). Prompts that parents, teachers, tutors, and others familiar with the student can consider include:

  • How do you think classes will be different in college?
  • What strategies have you learned in high school that you will bring to college?
  • What areas do you still have a hard time with?
  • What resources will there be in college that can help you with these areas?
  • Have you looked on your college website or reached out for more information about the resources you will use for support?
  • Is there a program on your campus that specifically responds to the needs of students with LD, and do you intend to reach out to this resource?

Supporting a student in answering and reflecting on these prompts will promote a more metacognitive awareness and ultimately help create a plan for the academic tasks of college. It is the student who is least prepared for the differences between high school and college who may face the most difficulty during the transition. Preparation prevents perspiration and is key to the transition.

Social:

If there is one common denominator for transitioning first-year students, it is the adjustment to a new social arena on campus. No matter whom they have been friends with, or how many or few friends they have had, they will need to build a new social circle. Supporting incoming freshmen in thinking about and anticipating the changes and choices they will face helps them adjust and consider what will be important and a priority in their social life at college. In preparing for the tasks of social adjustment, the goal is to enhance awareness of the skills that will be needed to connect with new friends.

To prepare for this social adjustment, a person familiar with and supportive of the student can prompt them to respond to the following:

  • How have I been successful in my relationships with peers and authority figures in the past?
  • Where have I had challenges?
  • What two areas do I think need to change?
  • How will these improve how I manage socially?
  • What activities or interests do I have that may be areas I pursue in college clubs or organizations?
  • What resources does my new college have that I can use to help me in making social connections?

These and other prompts can channel past experience into helpful reflection, which will not only help a student organize and reflect on challenges in this arena, but also highlight successes and strengths so that these can become a part of a strategy or plan they can put in their college transition ‘toolbox.’

Balance:

Balance is key for us all and truly a never-ending endeavor; however, during the first year it is particularly challenging to establish. Students with LD often have a history of structured support in tackling academics, time management, sleep, recreation, etc. College life will usher in a new life of finding that balance more independently. Time management and staying adequately organized are two of the most commonly discussed issues; they are key factors in success and, when neglected, key factors that interfere with it. Once again, invite the student to reflect on prompts that encourage metacognitive reflection and promote a plan of approach. Consider the following:

  • What is your plan for keeping track of your course work and other commitments (social, clubs, appointments etc)? A traditional planner book? A digital planning system?
  • What efforts to stay organized have worked in the past? Why/why not?
  • What has not worked in the past? Why/why not?
  • How will you fit in sleep, wellness needs, recreation, and other commitments with school work?
  • What will be challenging in doing this?
  • What will be the red flags that you are having a hard time finding balance?
  • What will be your plan of action if you are having a hard time with the balance of college life?
  • What will be your go-to resources on campus and off campus to support you in finding balance?

In conclusion, supportive prompts and reflection will promote awareness, critical thinking, and purposeful planning for these issues in the transition to college. Doing so prior to day one of college is helpful, but it can also be continued as the student enters college and embraces the new realities of college life.

Understanding how one approaches academics is particularly important for a student with a learning disability. This will be key for college wellness and will help them navigate the transition. By applying metacognition, the student is encouraged not only to think about their thinking in the areas of academics, social development, and finding balance, but also to discern which strategies to apply and to strengthen their perceived capacity to self-manage the challenges ahead. With these skills in hand, self-advocacy is heightened, which is a key element of success for college students with learning disabilities.

Hall, C. W., & Webster, R. E. (2008). Metacognitive and affective factors of college students with and without learning disabilities. Journal of Postsecondary Education and Disability, 21(1).


Academic Advising Tools through a Metacognitive Lens

Heather Mitchell, Associate Professor of Psychology, Webster University, hmitchell33@webster.edu; Kim Kleinman, Director of Undergraduate Advising, Webster University, kleinman@webster.edu; & Ronald Daniel, Director, London, Webster University, ronalddaniel93@webster.edu

Background / Motivation

Metacognitive practices in advising have been documented (e.g., Freeman, 2008; Sullivan-Vance, 2008), and the literature is full of suggestions on how to support purposeful student engagement and learning outcomes through advising (e.g., Campbell & Nutt, 2008; Willcoxson & Wynder, 2010). For example, academic advising is a type of student learning, and we know metacognitive approaches are a best practice for improving such learning. By infusing academic advising with metacognitive tools, we enhance intentional student engagement through the process of advising.

Why do I have to take this class? How is this requirement going to benefit me? When will I ever use this information again? Comments similar to these three questions led us to develop the advising syllabus and curriculum planner as tools to use in academic advising. Purposeful advising is a critical component of higher education as we prepare students to be responsible, global citizens in the 21st century. Additionally, metacognition can be an extremely useful tool in an effort to promote student achievement.

Nuts and Bolts / Method

This report provides a brief overview of two advising practices (i.e., advising syllabi and curriculum planners) we use to help deliver successful, engaging experiences for students. The advising syllabus and planner are both metacognitive in nature and thus can help each student and advisor remain intentional and reflective about the student’s college career. Webster University’s advising center and individual faculty in the college refine the advising syllabus as needed for use with students. Additionally, curriculum planners, or the “Planner”, originally developed at Virginia Tech, provide a helpful organization tool for students to use while laying out their academic path in higher education. Students at Webster University’s Geneva, Switzerland campus and students at Webster’s St. Louis campus have benefitted from these tools.

Advising Syllabus. Such a syllabus includes an advising mission or statement/philosophy of advising and allows advisors to outline expectations and responsibilities for both students and advisors. See Appendix A: Undergraduate Advising Syllabus as an example. Learning outcomes and a timeline/calendar of advising events are key components of such a syllabus, as is a list of resources an advisee may find useful. Advising is an essential component of an educational mission, and an advising syllabus helps convey the importance of advising, similar to the way course syllabi are a regular part of every student’s classroom education. Individual advisors, whether professional advisors from the University’s Advising Center or individual faculty advisors, personalize the specific criteria, descriptions, learning outcomes, and responsibilities for their advisees.

The Planner. Both paper and online versions of the Planner have been created. Computer science students at Virginia Tech developed the online version of the Planner as a way of “saving” the first draft of their holistic academic plan, including curricular and extra-curricular components. Both the individual student and their advisor must provide the specific knowledge and details concerning career information, interests, and plans such as graduate school, technology competencies, and language competencies. Students now commonly use paper versions of the Planner, which require crucial information such as the student’s course and activity interests as well as details about the availability of those courses and/or activities. See Appendix B: Planner Template.

Specifically, the Planner provides students with an opportunity to “map out” their remaining time, requirements, and other activities so students can make the most of their college experience. Students are asked to review the requirements for obtaining their specific degree and they are provided with various resources (through web links and/or Advising Worksheets appropriate to their major) to use in order to create their Planner. To complete the Planner students must include the courses taken as well as those they plan to take in their academic career. In addition to coursework, students should include any other experiences relevant to their own professional development (e.g., volunteer opportunities, research involvement, study abroad, or internships). Use of the Planner is certainly varied (similar to use of the Advising Syllabus). Advisors encourage students to plan, monitor, and evaluate their academic and co-curricular progress with these planners. Students also should include on the Planner when they might begin to search for and apply to jobs, graduate programs, etc.

Outcomes / Lessons Learned

Formal investigations of these tools have not been conducted; however, anecdotal evidence suggests students and faculty have found them beneficial. For example, when reflecting on the Planner, one student commented in their course evaluation that [the Planner] “is a great opportunity to identify goals and get a really actionable plan in place to achieve it.” Students appear to benefit from these tools most when they are provided ample time to understand, create, and appropriately adjust the specific mechanisms of both tools. These tools are ideal when engaged as iterative experiences. Specifically, advisors first provide appropriate scaffolding to advisees by introducing the tools. Additionally, the metacognitive nature of both tools allows students to move beyond knowing and understanding their academic requirements to analyzing, evaluating, and creating their plan to meet such requirements. In other words, the tools reflect a shift from the lower-level skills at the bottom of Bloom’s taxonomy to the higher-level skills at the top. The level of specific metacognitive guidance each advisor provides advisees is also completely variable, as neither of these tools is mandatory. Both tools simply provide a metacognitive lens through which advisors and students can view the advising process.

References

Campbell, S.M., & Nutt C.L. (2008). Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Review, 10, 4-7.

Freeman, L.C. (2008). Establishing effective advising practices to influence student learning and success. Peer Review, 10, 12-14.

Sullivan-Vance, K. (2008). Reenvisioning and revitalizing academic advising at Western Oregon University. Peer Review, 10, 15-17.

Willcoxson, L., & Wynder, M. (2010). The relationship between choice of major and career, experience of university and attrition. Australian Journal of Education, 54, 175-189.


Facilitating Student Success through Awareness of One’s Own Study Habits

Randi Shedlosky-Shoemaker, York College of Pennsylvania, rshedlos@ycp.edu

Background

As an academic advisor for new college students, I often see them struggle to understand the differing demands of high school and college, particularly in terms of self-regulation. Professors generally expect their students to complete work outside of class, including assigned and self-initiated activities (e.g., reviewing material). Given students’ experience with homework during their K-12 years, professors and advisors may be tempted to presume college students are well practiced at completing such work and already know how to regulate the related behaviors. Although research suggests that self-monitoring and knowledge of how students learn improve as they age (Brown & Smiley, 1977; Pressley, Levin, & Ghatala, 1984), college students can still struggle with this metacognitive skill (Pressley & Ghatala, 1990). This impaired metacognition may be due in part to a lack of transition training.

In high school, it was likely easier for students to determine what work they had to do because someone told them what to do. College presents a different environment; perhaps for the first time, students become largely—if not solely—responsible for regulating their studying. Although students may still have assigned homework, they also have to decide what course materials to read, what to study and how, and what and when to review. This leap from being highly guided by others to being responsible for regulating their own studying can present a challenge to college students, particularly when no intermediate steps help scaffold students’ learning of self-regulation and metacognitive skills. To assist students in understanding their own study habits, I set out to examine what role weekly study reports could play in students’ overall academic performance.

Method

I recruited undergraduate students enrolled at a mid-sized private four-year college to participate in a semester-long study on study habits. Interested students could complete up to 12 weekly study reports (adapted from Bembenutty & White, 2013; see Appendix). Through the online report, students recorded what assigned and self-initiated work they completed during the past week. Students also completed a survey at the beginning and end of the semester, measuring their feelings of motivation related to their courses (items adapted from the Intrinsic Motivation Inventory, including the choice, tension, effort, enjoyment, and value subscales; Ryan, 1982) and other academic factors (e.g., high school GPA, cumulative college GPA, credits enrolled during the current semester). In the final survey, students assessed their experience using the study reports. Students who completed both surveys and submitted at least eight weekly reports were entered into a raffle to win a gift card to the college bookstore.

Outcomes/Lessons Learned

Of the 77 students I observed during the semester, most (n = 64, 83%) submitted at least one of the 12 reports; 36 students (47%) completed at least eight reports, and 10 students (13%) submitted all 12 reports. Cumulative GPA prior to that semester was unrelated to the number of reports submitted, r(72) = 0.15, p = .21, as was academic standing based on earned credits, r(77) = -.12, p = .29. Motivation measures were unrelated to the number of reports submitted, except for choice: students who felt they had more choice in selecting their classes also submitted more reports, r(75) = .26, p = .02.

To examine the relation to academic performance that semester, I conducted a multiple regression analysis, including previous cumulative GPA (i.e., prior to the start of the study), number of credits earned (i.e., academic standing), perceived choice in taking their courses, and number of study reports completed as potential predictors. Only two of the variables predicted semester GPA: previous cumulative GPA (B = 0.73, t = 9.54, p < .001) and number of reports submitted (B = 0.19, t = 2.33, p = .02). Because previous academic success was accounted for and did not predict how many study reports students completed, it seems unlikely that the positive relation between number of reports completed and semester GPA can be attributed merely to a “good student” effect, whereby good students would simply be more inclined both to complete the reports and to engage in behaviors that improved their academic performance. Finally, none of the measures of motivation predicted whether students completed the weekly reports.
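For readers who want to see the shape of this kind of analysis, here is a minimal sketch of a comparable multiple regression in Python. It is illustrative only: the file name and column names (semester_gpa, prior_gpa, credits_earned, perceived_choice, n_reports) are hypothetical placeholders, not the study’s actual dataset.

```python
# Minimal sketch of a multiple regression like the one described above.
# The file name and column names are hypothetical, not the study's data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("study_reports.csv")  # hypothetical dataset

# Predict semester GPA from prior cumulative GPA, credits earned (academic
# standing), perceived choice in course selection, and number of weekly
# study reports submitted.
model = smf.ols(
    "semester_gpa ~ prior_gpa + credits_earned + perceived_choice + n_reports",
    data=df,
).fit()

print(model.summary())  # unstandardized coefficients (B), t-values, p-values
```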

Among the 41 students who completed the final survey, most indicated that the reports were helpful (n = 34, 83%) and felt they gained useful insight by completing them (n = 30, 73%). In the open-ended remarks, students noted that the reports helped them realize how much work they were (or were not) doing and how much time tasks/classes required. Several students remarked that they had learned about their own study habits, including common distractors they struggled with, inadequate strategies they used, and their own failures in time management. In light of students’ remarks, the reports appeared to give students an opportunity to regularly and explicitly think about their own study habits. In doing so, students may develop improved metacognition related to their own learning.

Employed more intentionally as a learning tool, the reports could allow instructors and advisors to go beyond helping students develop a heightened awareness of their study behaviors. Advisors could have new students, or students who are struggling academically, maintain a journal of their studying during a set period of time and then provide individualized suggestions in an informal one-on-one conversation. Expecting students to have an actual record means that the advisor and student are not relying on autobiographical introspection about the student’s behaviors to understand current problems or develop future studying plans. In a class, particularly for courses that address effective learning strategies as a student learning outcome (e.g., first-year seminar courses, major orientation courses), instructors could create more structured engagement with the tool by having students not only record behaviors but also identify patterns of behavior over time and make evidence-based plans for future studying. Such a strategy offers an opportunity to incentivize completing the reports (e.g., course credit) and could be a valuable stepping stone for students as they transition from highly guided learner to self-directed learner.

References

Bembenutty, H., & White, M. C. (2013). Academic performance and satisfaction with homework completion among college students. Learning and Individual Differences, 24, 83-88. doi: 10.1016/j.lindif.2012.10.013

Brown, A. L., & Smiley, S. S. (1977). Rating the importance of structural units of prose passages: A problem of metacognitive development. Child Development, 48, 1-8. doi: 10.2307/1128873

Pressley, M., & Ghatala, E. S. (1990). Self-regulated learning: Monitoring learning from text. Educational Psychologist, 25, 19-33. doi: 10.1207/s15326985ep2501_3

Pressley, M., Levin, J. R., & Ghatala, E. S. (1984). Memory strategy monitoring in adults and children. Journal of Verbal Learning and Verbal Behavior, 23, 270-288. doi: 10.1016/S0022-5371(84)90181-6

Ryan, R. M. (1982). Control and information in the intrapersonal sphere: An extension of cognitive evaluation theory. Journal of Personality and Social Psychology, 43, 450-461. doi: 10.1037/0022-3514.43.3.450


Improving Metacognition with Pre- and Post-Exam Reflection Exercises (Academic Advising)

Kyle E. Conlon (conlonke@sfasu.edu) and Lauren E. Brewer (brewerle@sfasu.edu), Stephen F. Austin State University

Background/Motivation: We teach psychology at a large southern university of approximately 13,000 students. Many of our students, especially freshmen and first-generation students, possess ineffective study strategies that, understandably, lead to considerable frustration. They attend class, take careful notes, ask questions—do all the things we encourage them to do—and yet still underperform on their exams, leading them to ask, “What am I doing wrong?” When we ask students about their study strategies, we tend to find that they (1) rely on poor strategies (e.g., highlighting) and (2) lack insight into why their strategies aren’t working. Hence, we were motivated to create short pre- and post-exam reflection exercises to help students gain metacognitive awareness of their own study strategies.

Nuts & Bolts/Method: The pre-exam reflection exercise was designed for students to reflect on their exam preparation strategies and to identify obstacles to their studying (Appendix Table 1). The post-exam reflection exercise was designed for students to reflect on their exam performance and to determine whether it was necessary to change their study strategies for the next exam (Appendix Table 2). Fifty students (38 women, M_age = 21.10) across three psychology classes consented to participate. Each student completed four exams, yielding 200 discrete observations in which a student could have completed no reflections (n = 154), pre-exam reflections only (n = 18), post-exam reflections only (n = 8), or both pre- and post-exam reflections (n = 20). For this study, we compared exam grades for students who completed both pre- and post-exam reflections to exam grades for students who completed neither pre- nor post-exam reflections. Participation was voluntary and students were informed that they could withdraw from the study at any time without penalty. In exchange for their participation, participants were entered into a raffle for one of three $100 gift cards. These gift cards were distributed at the end of the semester after final grades were submitted.

Outcomes/Lessons Learned: The exam scores of students who completed both pre- and post-exam reflections (M_grade = 86.60, SD = 11.01) were significantly higher than the exam scores of students who did not complete the reflections (M_grade = 76.97, SD = 12.31), t(172) = 3.57, p < .001. Additionally, for each student, the number of exam reflections completed was positively correlated with exam average (r = .42, p = .03) and with final course grade (r = .50, p = .01).
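As a rough, self-contained illustration of these two analyses (an independent-samples t-test and a Pearson correlation), here is a short Python sketch. All values in it are placeholders for illustration, not the authors’ data.

```python
# Sketch of the analyses reported above; every value below is a placeholder.
import numpy as np
from scipy import stats

# Exam scores for observations with both reflections vs. neither (hypothetical)
both_reflections = np.array([88, 92, 79, 95, 84, 90, 87])
no_reflections = np.array([72, 80, 68, 77, 85, 74, 79])

t_stat, p_val = stats.ttest_ind(both_reflections, no_reflections)
print(f"Independent-samples t-test: t = {t_stat:.2f}, p = {p_val:.3f}")

# Per-student: number of reflections completed vs. exam average (hypothetical)
n_reflections = np.array([0, 2, 4, 6, 8, 1, 3])
exam_average = np.array([70, 75, 82, 88, 91, 72, 79])

r, p = stats.pearsonr(n_reflections, exam_average)
print(f"Pearson correlation: r = {r:.2f}, p = {p:.3f}")
```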

Our goal was to create brief reflection exercises to help our students gain insight into the effectiveness of their study strategies. More recently, we’ve begun to share these exercises with our academic advisees, some of whom consider dropping or avoiding classes due to poor performance. Although our specific guidance depends on the advisee, we generally encourage them to apply the exercises to the exams in the course or courses in which they’re struggling. We also try to review their responses with them to foster their metacognitive awareness (e.g., “I see you’re highlighting your notes and rereading the text; why do you think these strategies aren’t working?,” “So you felt prepared for this exam but underperformed; why do you think this happened?”). These exercises, which could be used by any academic advisor, jumpstart a discussion with advisees about how to study, which often gives them a renewed sense of hope and perspective for overcoming obstacles in their courses. In some cases, we’ll share specific articles from the metacognition literature (e.g., Putnam, Sungkhasettee, & Roediger, 2016) that dovetail with the use of these exercises. We typically meet with advisees once a semester for course selection, but we both have an open-door mentoring policy and encourage (and sometimes require) follow-up meetings with advisees, particularly those who are struggling and would benefit most from these exercises. Our experience suggests that advisees (1) generally possess poor insight into their studying, (2) express surprise that their strategies aren’t as effective as they believe (or as research shows), and (3) through these exercises are forced to think through their study habits in a way they might not otherwise. We’re hopeful that improving advisees’ metacognition extends beyond the classroom to help improve their grades, motivate them beyond initial struggles, and prevent dropout.

References:

Gurung, R. A. R. (2005). How do students really study (and does it matter)? Teaching of Psychology, 32, 239–241.

Henderson, V., & Dweck, C. S. (1990). Motivation and achievement. In S. S. Feldman & G. R. Elliott (Eds.), At the threshold: The developing adolescent (pp. 308–329). Cambridge, MA: Harvard University Press.

Putnam, A. L., Sungkhasettee, V. W., & Roediger, H. L. (2016). Optimizing learning in college: Tips from cognitive psychology. Perspectives on Psychological Science, 11, 652-660.

Trockel, M. T., Barnes, M. D., & Egget, D. L. (2000). Health-related variables and academic performance among first-year college students: Implications for sleep and other behaviors. Journal of American College Health, 49, 125–131. doi: 10.1080/07448480009596294


Thinking like a Sociologist, but how? Using Reflective Worksheets to Enhance Metacognition in a Classroom with Diverse Learners

Mabel Ho, Department of Sociology, University of British Columbia
Katherine Lyon, Department of Sociology and Vantage One, UBC
Jennifer Lightfoot, Academic English Program, Vantage One, UBC
Amber Shaw, Academic English Program, Vantage One, UBC

Background and Motivation for Using Reflective Worksheets in Introductory Sociology

Research shows that for first year students in particular, lectures interspersed with active learning opportunities are more effective than either pedagogical approach on their own (Harrington & Zakrajsek, 2017). In-class reflection opportunities are a form of active learning shown to enhance cognitive engagement (Mayer, 2009), critical thinking skills (Colley et al., 2012), and immediate and long-term recall of concepts (Davis & Hult, 1997) while reducing information overload which can limit learning (Kaczmarzyk et al., 2013). Further, reflection conducted in class has been shown to be more effective than outside of class (Embo et al., 2014). Providing students with in-class activities which explicitly teach metacognitive strategies has been shown to increase motivation, autonomy, responsibility and ownership of learning (Machaal, 2015) and improve academic performance (Aghaie & Zhang, 2012; Tanner, 2012).

We created and implemented reflective worksheets (see Appendix) in multiple sections of a first-year sociology course at a large research university with a high proportion of international English as an Additional Language (EAL) students. While all first-year students must learn to navigate both the academic and discipline-specific language expectations of university, many international students face additional barriers: new expectations must be met while working in an additional language, and possibly with different cultural assumptions, such as being unfamiliar with the active learning and thought processes privileged in a Western academic institution. With both domestic and international students in mind, our aims with these reflective worksheets are to:

  • facilitate and enhance students’ abilities to notice and monitor disciplinary awareness and knowledge while promoting disciplinary comprehension and practices.
  • connect course material to personal experiences (micro) and social trends (macro).

Nuts and Bolts: Method

We structured individual writing reflection opportunities every 10-15 minutes in each lecture in the small (25 students), medium (50 students) and large (100 students) classes. Each lesson was one hour and students completed the worksheets during class time in five-minute segments. The worksheets had different question prompts designed to help students:

  • identify affective and cognitive pre-conceptions about topics
  • paraphrase or explain concepts
  • construct examples of concepts just learned
  • contrast terms
  • describe benefits and limitations of social processes
  • relate a concept to their own lives and/or cultural contexts
  • discover connections between new material and prior knowledge (Muncy, 2014)
  • summarize key lecture points (Davis & Hult, 1997)
  • reflect on their own process of learning (see Appendix for further examples)

The question prompts are indicative of how to think about a topic, rather than what to think. These reflective worksheets are a way to teach students to think like disciplinary specialists in sociology, which aligns with the course learning outcomes. Completed worksheets were graded by Teaching Assistants (TAs), who used the rubric below (see Table 1) to assess students’ application and critical thinking skills. By framing the worksheets as participation marks, we motivated students to complete the assigned work while learning how to approach sociology as a discipline. As suggested in “promoting conceptual change” (Tanner, 2012), some of the worksheets required students to recognize their preconceived notions and monitor their own learning and re-learning. For example, in one of the worksheets, students recorded their preconceptions about a social issue (e.g., marijuana usage) at the beginning of the lecture and returned to the same question at the end of class. Through this process, a student can have a physical record of the evolution of his/her beliefs, whether that means recognizing and adjusting preconceived notions or deepening justifications for beliefs.

Table 1: Sample Assessment Rubric

  • 3: Entry is thoughtful, thorough, and specific. Author draws on relevant course material where appropriate. Author demonstrates original thinking. Entries correspond to questions asked.
  • 2: Entry is relevant but may be vague or generic. Author could improve the response by making it more specific, thoughtful, or complete.
  • 1: Entry is unclear, irrelevant, incomplete, or demonstrates a lack of understanding of core concepts.

Outcomes: Lessons Learned

We found the reflective worksheets were effective because they gave students time to think about what they were learning and, over time, increased their awareness of the disciplinary construction of knowledge. As instructors, we found the worksheets a useful tool for monitoring students’ learning and the ‘take away’ messages they drew from the lectures. We also used the worksheets as a starting point in the next lecture to clarify any misunderstandings.

Overall, we found that while the reflective worksheets seemed to be appreciated by all the students, EAL students specifically benefitted from them in a number of ways. First, the guided questions gave students additional time to think about the topic at hand and to prepare before classroom discussion. Instead of being cold-called, students had this reflective time to gather their thoughts and actively think about what they had just learned. Second, students were able to explore the structure of academic discourse within the discipline of sociology. As students learn through different disciplinary lenses, these worksheets reveal how a sociologist approaches a topic. In our case, international EAL students are also taking courses such as psychology, academic writing, and political science; each of these disciplines engages with a topic using a different lens and language, and the worksheet made the sociological approach explicit. Last, the worksheets allow students to reflect on both the content and the way language is used within sociology. For example, the worksheets gave students time to brainstorm and think about what questions are explored from a disciplinary perspective and what counts as evidence. Furthermore, when given time to reflect on the strength of disciplinary evidence, students can then determine which language features may be most appropriate to present that evidence, such as whether hedges (may indicate, possibly suggest, etc.) or boosters (definitely proves) would be more appropriate. When working with international EAL students, it becomes extremely important to make such language features visible so students can in turn take ownership of them in their own language use. Looking forward, these worksheets can help guide both EAL and non-EAL students’ awareness of how knowledge is constructed in the discipline and how language can be used to reflect and demonstrate their disciplinary understanding.

References

Aghaie, R., & Zhang, L. J. (2012). Effects of explicit instruction in cognitive and metacognitive reading strategies on Iranian EFL students’ reading performance and strategy transfer. Instructional Science, 40(6), 1063-1081.

Colley, B. M., Bilics, A. R., & Lerch, C. M. (2012). Reflection: A key component to thinking critically. The Canadian Journal for the Scholarship of Teaching and Learning, 3(1). http://dx.doi.org/10.5206/cjsotl-rcacea.2012.1.2

Davis, M., & Hult, R. E. (1997). Effects of writing summaries as a generative learning activity during note taking. Teaching of Psychology, 24(1), 47-50.

Embo, M. P. C., Driessen, E., Valcke, M., & Van Der Vleuten, C. P. (2014). Scaffolding reflective learning in clinical practice: A comparison of two types of reflective activities. Medical Teacher, 36(7), 602-607.

Harrington, C., & Zakrajsek, T. (2017). Dynamic Lecturing: Research-based Strategies to Enhance Lecture Effectiveness. Stylus Publishing, LLC.

Kaczmarzyk, M., Francikowski, J., Łozowski, B., Rozpędek, M., Sawczyn, T., & Sułowicz, S. (2013). The bit value of working memory. Psychology & Neuroscience, 6(3), 345-349.

Machaal, B. (2015). Could explicit training in metacognition improve learners’ autonomy and responsibility? Arab World English Journal, 6(1), 267.

Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York, NY: Cambridge University Press.

Muncy, J. A. (2014). Blogging for reflection: The use of online journals to engage students in reflective learning. Marketing Education Review, 24(2), 101-114. doi:10.2753/MER1052-8008240202

Tanner, K. D. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11(2), 113-120.


Prompted Written Reflection as a Tool for Metacognition: Applying Theory in Feedback

Dr. Phani Radhakrishnan & Emma Kerr
Management Department, University of Toronto
Contact: phani.radhakrishnan@utoronto.ca & emma.kerr@mail.utoronto.ca

Background/Motivation

Understanding how to seek feedback is a core topic in the curriculum of leadership courses. However, not all feedback-seeking activities are effective. The purpose of this activity is to help students apply empirical research about feedback to their experiences of receiving and interpreting feedback. We hoped the activity would enable them not only to understand how to seek and interpret feedback but also to learn how to apply it to their own learning, and thus encourage metacognitive thinking.

Method

This activity took place approximately midway through the semester in a third-year mandatory course for students in the business administration program. There were approximately 40 students in each class. Students were introduced to the rationale for the activity by reading DeNisi and Kluger’s (2000) review article about the relation between feedback and performance. They then answered questions requiring them to explain the key concepts in the reading and apply the theory to a real-life example (see Appendix A). Next, students listened to a short lecture that explained the theory behind the factors that increase and decrease the effectiveness of feedback. The lecture focused on the finding that the way people seek feedback can have an impact on their subsequent performance. If people seek task-based feedback, by looking for the correct answer or by asking for ways to improve their answer and focusing on how to learn the task, their subsequent performance on the task will improve. But if they seek self-based feedback, by looking for information on how they did relative to others, such as asking for the class average, their subsequent performance on the task will not improve.

Then, students reflected on how they could apply this knowledge to their own lives. They wrote a response to the following question: “Consider a situation where you got feedback that did not help you improve your subsequent performance. Explain why the feedback was ineffective in terms of task, task learning, or self-focus. What could you have done to increase your focus on task or task learning and decrease attention to self?” We then gave students examples of learning goals (see Appendix B) and asked them to write learning goals with reference to the example they had just written about. Students who volunteered then read their written reflections out loud to the rest of the class. Students were graded for participation in the homework activity as well as for their participation in the in-class discussion. Finally, students were asked a similar question on the final exam about feedback they received in this course. Specifically, they had to reflect on the feedback they received on a course assignment in which they were evaluated on argumentation, definitional, data analytical, and descriptive skills, and to set a learning goal for improving these skills. We hoped that these multiple writing prompts would encourage students to apply research about feedback to their own experiences with receiving and seeking feedback. Such written reflections should encourage students to learn how they may help or hurt themselves through the kind of information they focus on when asking for feedback from instructors (e.g., asking for the class average vs. asking for how to improve, or asking for the correct answer).

Our activity was guided by research suggesting that a written prompt that encourages students to think critically about their own experiences can encourage metacognition. For example, Ratto-Parks (2015) asked first-year college students to think about a rhetorical story assignment they had completed in a course and to reflect on what they did well and where they could improve. She found that these student reflections improved metacognition and strengthened writing quality. Just as Ratto-Parks’ activity encouraged students to reflect on, and thus improve, their writing skills, we hoped that our questions would guide students toward the kinds of information they should focus on while seeking feedback, encourage them to meta-cognize about their experiences with feedback, and prompt them to reflect on how to use feedback-seeking opportunities to improve themselves. Similarly, other studies have found that written reflections prompting critical reflection can elicit metacognitive processes (Erskine, 2009; Harten, 2014; Lew & Schmidt, 2011).

Further, feedback itself has also been shown to improve metacognition (Callender, 2016). In our activity, we evaluated students on multiple skills in their course-related writing assignment (e.g., argumentation, definitional skills etc.) and then asked students to reflect on what that feedback means. By asking students to relate information learned in the course to their past feedback-seeking experiences and by providing opportunities to apply that knowledge while they are getting feedback in the course, we think we are helping students to improve their metacognitive skills since they are using both written reflections and feedback as tools to develop such skills. Taken together, the in-class writing exercise, an explanation of the theory behind feedback, an opportunity to get feedback, and answering a question on the final exam about that feedback should all improve meta-cognitive skills. This is also predicted by past research cited above.

Outcomes

Preliminary analysis shows that highly engaged students (i.e., those who read the article, answered the homework questions, wrote a reflection, and participated in class discussion) tended to earn higher marks on the related final exam question. Overall, students showed an improved understanding of effective feedback following the in-class activity. We plan to continue systematically analyzing student responses to the initial in-class reflection questions and to the final exam questions. We hope to detect metacognitive thinking by using Ratto Parks’ (2015) Index of Metacognitive Knowledge in Critical Reflective Writing, which shows promise in translating metacognitive language into identifiable traits that can be used to assess students’ reflections. This analysis could be challenging because our activity consists of only one in-class reflection question based on prior feedback-seeking experiences and one final exam question based on a feedback-seeking experience in the course itself, whereas most studies include multiple written reflections. To detect improvement in metacognition, we may need to have students repeatedly answer questions about what they are learning across multiple feedback contexts. This parallels our prior research (Radhakrishnan, Arrow, & Sniezek, 1996), which shows that asking students to repeatedly evaluate their performance over multiple tests, after receiving feedback on each test, improves the accuracy of those evaluations. Improving students’ understanding of what they are learning (that is, their metacognitive skills) may follow a similar mechanism. Multiple written reflections about how to interpret feedback, completed while receiving feedback on multiple tasks, can help students gain an improved understanding not only of the theory of feedback but also of themselves.
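To give a concrete sense of what “accuracy of self-evaluations” might look like in practice, here is a minimal sketch in Python with invented numbers; the scores and the simple absolute-error measure are illustrative assumptions, not data or analyses from Radhakrishnan, Arrow, and Sniezek (1996).

```python
# Hypothetical sketch: tracking how closely a student's predicted scores match
# actual scores across several tests. The numbers are invented for illustration.

predicted = [85, 78, 82, 80]   # self-predicted scores on tests 1-4
actual    = [70, 72, 79, 81]   # scores the student actually earned

# Absolute prediction error per test; smaller values mean more accurate self-evaluation.
errors = [abs(p - a) for p, a in zip(predicted, actual)]
print(errors)  # e.g., [15, 6, 3, 1] -- accuracy improving with repeated feedback
```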

We expect that both the improved understanding of effective feedback processes as well as the opportunity to practice metacognition will help students to interpret and give feedback more effectively both within and outside of the course. Since our students are in the management discipline, seeking feedback effectively is a skill that is essential to their professional development as leaders. In addition, we predict that the improved experience with metacognitive processes will aid them in thinking critically and interpreting feedback in their other courses as well.

References

DeNisi, A. S., & Kluger, A. N. (2000). Feedback effectiveness: Can 360-degree appraisals be improved? Academy of Management Perspectives, 14(1), 129-139. doi:10.5465/ame.2000.2909845

Erskine, D. L. (2009). Effect of prompted reflection and metacognitive skill instruction on university freshmen’s use of metacognition (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses database.

Harten, M. D. (2014). An evaluation of the effectiveness of written reflection to improve high school students’ metacognitive knowledge and strategies (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses database.

Lew, D. N., & Schmidt, H. G. (2011). Writing to learn: Can reflection journals be used to promote self-reflection and learning? Higher Education Research & Development, 30(4), 519-532. doi:10.1080/07294360.2010.512627

Radhakrishnan, P., Arrow, H., & Sniezek, J. A. (1996). Hoping, performing, learning, and predicting: Changes in the accuracy of self-evaluations of performance. Human Performance, 9(1), 23-49. doi:10.1207/s15327043hup0901_2

Ratto Parks, A. E. (2015). The power of critical reflection: Exploring the impact of rhetorical stories on metacognition in first year composition stories (Doctoral dissertation).

*

Appendix A—Homework Question

The way in which feedback is given can draw one’s attention to oneself, and this attention to self leads to negative effects on subsequent performance (after the feedback). However, the article also discusses conditions under which feedback focused on the self may not necessarily lead to negative effects. Explain how this occurs.

Use the concepts of ought vs. ideal self and promotion vs. prevention focus. Illustrate how the theory works by applying it to a concrete, real-life situation or example.

*

Appendix B—Examples of Learning Goals

(The following is displayed on a slide in lecture to aid students in developing their own learning goals)

For a professor…

  • Finding specific ways to explain complex material memorably
  • Explaining concepts by giving examples and counterexamples
  • Explaining theories by giving concrete examples of how the underlying process works
  • Showing the relevance of the subject matter to the students’ lives outside the classroom

For a golfer…

  • Mastering the proper grip of the club
  • Mastering proper placement of the feet
  • Learning when to use what club
  • Understanding the distribution of weight from one foot to the other when swinging the club

Practicing Metacognitive Awareness with Guided Lecture Notes

by Dr. Terrell Hooper, Assistant Professor of Music, American University of Sharjah (thooper@aus.edu)

Background/Motivation:

I teach an Elements of Music course for music minors and a general population of engineering, business, and architecture students needing to earn a general arts credit. I have experienced many challenges in teaching such a course in the Middle East, where students have never been exposed to the elements of Western music or its history. The course surveys the entire gamut of Western music and history while simultaneously building a foundational understanding of music literacy. Given the vast parameters of the course, students are expected to have strong independent study skills. While study habits are primarily individual and differ with each student, I found that students were not prepared or equipped with the basic study skills required to succeed in the course. The most basic missing skill was the ability to take notes and organize the material being discussed in class. In addition, student feedback on end-of-course evaluations revealed that the information discussed in class was so unfamiliar and vast that students did not know how to organize or digest it. From these data, I inferred that students needed a note-taking model, an opportunity to take notes of their own volition, and a moment to reflect on their note-taking abilities. By implementing these objectives, I wanted to observe whether they would encourage students to think in a metacognitive manner and perhaps awaken them to the importance of metacognitive practices in their own study habits.

Method:

Due to the pedagogical “bumps” that I experienced in my first semester of teaching Elements of Music, I decided to create a sequential curriculum (see Figure 1) that would provide students with guided lecture notes[1]. The purpose of these notes was twofold: 1) to help students structure information being discussed during class, and 2) to help students remember, reflect on, and re-organize course content during independent study. The sequential curriculum was aligned with the syllabus; students were not taught note-taking skills but were simply provided with guided lecture notes that I prepared prior to each class meeting.

Three sets of guided lecture notes were given over a three-week period (see Appendix A). Each week, the guided lecture notes were designed as a progressive guide for helping students become more metacognitively aware of proper note-taking habits during in-class lectures. The first set oriented students to the process of taking notes in an outline format and contained fill-in-the-blank areas placed throughout the outline. Subsequent sets included reflective questions at the end of the lecture: Lecture 2 contained recall questions and Lecture 3 contained essay questions about content discussed during the lecture. All three sets of guided lecture notes were collected after each class, and data were recorded on how many students completed the entire handout and on its overall completion (i.e., the number of blanks left on the handout). Lecture 4 (the Classical Lecture) did not use guided lecture notes, and no instruction or requirements for note taking were given, because I wanted to observe how many students saw the need to take notes of their own volition.

Following the review session (see Figure 1), a midterm exam was administered. The midterm exam consisted of multiple-choice, fill-in-the-blank, and true-or-false questions copied verbatim from the previous semester’s exam so that scores could be compared with those of students who had not been exposed to guided lecture notes. After the midterm exam, students were given a survey via Google Forms asking about the usefulness of the guided lecture notes. Finally, I gave a ten-minute lecture that shared the data gathered in the questionnaire, statistics on how many students completed each handout during each lecture, and a comparison of exam scores between students who used guided lecture notes and the Fall-semester students who did not.

Figure 1. Sequential Curriculum for Guided Lecture Notes

Data Outcomes:

All students enrolled in Elements of Music for the Spring semester participated in the study (n=29); however, due to class absences, Lecture 1 had 27 participants, Lecture 2 had 28 participants, and Lecture 3 had 25 participants. A set of 30 questions derived directly from the lecture notes was used on the midterm exam for Spring-semester students (n=29) and on the final exam for Fall-semester students (n=28). Each of the 30 exam questions was scored as correct or incorrect on both the Fall and Spring exams, and the total number of incorrect answers was calculated for each student. An independent t-test revealed no significant difference between groups, t(53)=1.02, p=.31; Mean (SD) Fall semester = 4.9 (3.4) and Mean (SD) Spring semester = 4.0 (2.9).
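For readers who want to run this kind of between-group comparison on their own data, here is a minimal sketch in Python; the error counts below are invented placeholders, not the actual Fall and Spring scores.

```python
# Hypothetical sketch of the between-semester comparison described above.
# The error counts are invented placeholders, not the actual class data.
from scipy import stats

fall_errors   = [4, 7, 2, 9, 5, 3, 6, 1, 8, 4]   # incorrect answers per Fall student
spring_errors = [3, 5, 2, 6, 4, 1, 7, 3, 5, 4]   # incorrect answers per Spring student

# Independent-samples t-test on the number of incorrect answers per student.
t_stat, p_value = stats.ttest_ind(fall_errors, spring_errors)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```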

On a more positive note regarding the incorporation of the guided lecture notes, students who completed the questionnaire (n=23) gave strongly positive ratings. They rated their overall satisfaction on a 3-point scale ranging from unsatisfactory to satisfactory to extremely satisfactory. Results indicated that 69.6% (n=15) of students surveyed were extremely satisfied with using guided lecture notes, and 30.4% (n=7) chose the middle option (satisfactory). Open-ended student feedback on using the teacher-guided lecture notes is represented in Table 1.

Table 1. Student Feedback on Using Guided Lecture Notes

Pros:

  • “Provides important details and helps us focus on what is more important”
  • “Guides me through the chapters while studying from the book”
  • “They were a very good guide when it came to studying for midterms as they summarized the main concepts”
  • “These outlines make it easier to understand and absorb the material faster”

Cons:

  • “More detailed questions”
  • “Sometimes the questions are vague and need clarification”
  • “Include a list of keywords”
  • “The information was a lot and we didn’t have enough time to complete it during class while the professor was explaining it. Sometimes I felt I couldn’t keep up the pace while listening to the lecture and writing thus I left many blanks to fill in later which made me unsure of my answers.”

Observations:

The primary purpose of this research study was to 1) help students structure information being discussed during class and 2) help students remember, reflect on, and re-organize course content during independent study. The study suggested that when students organize, reflect on, and collaborate with their teachers on their own learning, the pedagogical process improves. Although the data do not confirm that guided lecture notes improve test scores, it would be remiss not to acknowledge that students do enjoy being provided with a structure for organizing the information presented during lectures. In addition, no negative feedback concerning the amount of material or the organizational components of the course was received on end-of-course student evaluations.

The intent of having students take personal initiative with note taking in Lecture 4 (see Figure 1), and of giving an informative ten-minute lecture on the possible gains of using such an organizational scheme when listening to class lectures, was to prompt students to think more about their own study skills. Generally speaking, however, I did not observe most students beginning to practice metacognition regarding their own study habits. Instead, I observed students wanting or expecting guided lecture notes for every class; the end-of-course student evaluations even noted that students wanted guided lecture notes for each class lecture. Even though students reflected positively on the usefulness of the guided lecture notes, I observed a disconnect when it came to motivating students to take personal initiative for their own study habits. Future research should investigate the link between in-class lectures and how students become more self-directed in their independent study habits.

[1] Guided notes are defined as “teacher-prepared handouts that ‘guide’ a student through a lecture with standard cues and prepared space in which to write the key facts, concepts, and/or relationships” (Heward, 1994, p. 304).


Metacognition supports HIP undergraduate research

by Dr. John Draeger, SUNY Buffalo State

The Association of American Colleges and Universities (AAC&U) has identified a number of high-impact learning practices (e.g., undergraduate research, collaborative assignments, learning communities, service learning, study abroad, capstone seminars). Each of these learning practices involves a significant investment of student effort over time, with multiple interactions between faculty and students about substantive matters, frequent, constructive feedback from faculty, and regular, structured processes for reflection and integration (Kuh, 2008; Kilgo, Sheets, & Pascarella, 2015). This post offers some strategies for intentionally structuring undergraduate research experiences and building metacognition into the process. Subsequent posts will consider other high-impact practices (HIPs).

Undergraduate research is a HIP because students ask the questions and set the research agenda. Inquiry-based projects such as undergraduate research promote student autonomy and self-direction and teach students about the process of inquiry itself (Healey & Jenkins, 2009; Kilgo & Pascarella, 2016). Without guidance, however, students can find themselves in a hot mess. After years of mentoring undergraduate research projects in philosophy, I’ve developed the following model to help keep students on track. Elements of this model may seem obvious and common practice; I don’t claim that it is novel, but I offer it as a distillation of some lessons that I’ve learned the hard way.

First, philosophers like to ask the big questions (and they should), but unless topics are reined in, student research can easily sprawl into sloppy thinking. Thus, I talk with students about topic refinement early and often. I begin student meetings by asking them to give a one-minute “elevator pitch” for their topic; as the topic gets refined, the pitch becomes easier. En route to refining the topic and developing the elevator pitch, I ask a series of critical questions about the underlying conceptual issues. For example, if a student wants to consider what parents owe their children, I will push her to consider the nature of obligation (e.g., human rights, fairness, well-being, character, social roles) and concrete cases that may or may not fall within the scope of that obligation (e.g., providing food, a new bike, college tuition). Prodding students to consider the nature and scope of the obligation prompts them to consider the underlying philosophical substructure, which is what I believe philosophical inquiry is all about (Draeger, 2014). However, once students begin making deep conceptual connections, it is easy for a topic to sprawl as students come to believe that each connected idea needs its own separate discussion. Metacognition encourages students to be aware of their own learning process (e.g., research) and make intentional adjustments based on that awareness. Encouraging students to be aware of the possibility of topic sprawl can help them better evaluate whether their current thinking is moving away from the core issue or toward a better version of it.

Second, all of us are standing on the shoulders of giants. It is good scholarship to acknowledge the original thinking of others through proper citation. However, the research experience should teach students more than not to plagiarize. Rather, undergraduate research allows students the opportunity to become co-inquirers within an existing scholarly conversation. Becoming familiar with the literature allows them to tap into long-standing debates and use conceptual distinctions developed by others. As students begin their research, each comes with their own background and dispositions. Some believe they need to read everything on a topic before they venture an opinion. Others are so eager to begin that they skip the literature review and soon find themselves lost without the resources found within the tradition. Metacognition can help students become aware of when they are reading too much or too little, as well as point the way to adjustments in their process.

Third, many students struggle to find the relevant source material in philosophy. Even if they know how to use the library, they are often unfamiliar with the idiosyncrasies of philosophy as a discipline. For this reason, I explicitly discuss how to go about doing library work (e.g., how to use library databases, how to conduct keyword searches, how to decide which articles seem promising), reading strategies (e.g., how to read at different speeds to find the articles most deserving of attention, how to read identified articles more carefully, how to annotate a text with an eye toward research), and note-taking strategies (e.g., how to organize summaries, critical questions, conceptual applications, personal reflections). When undergraduate research is embedded in my course, we discuss these strategies in class. When undergraduate research takes the form of an independent project, I discuss these strategies one-on-one. In either case, I encourage students to practice becoming aware of what’s working, what’s not, and when they need to adjust their strategies.

Fourth, my undergraduate research students are required to keep a weekly journal. Students are asked to track pesky questions, troublesome counter-examples, and worrisome objections. Beyond their focus on content, however, students are also asked to focus on their own process, including a sketch of the library, reading, and writing strategies attempted as well as whether those strategies were successful. Journaling about these strategies is another way to encourage metacognitive awareness about the research process and locate opportunities for intentional self-regulation.

Undergraduate research can be a HIP (if implemented well) because it encourages students to learn about the research process on their own terms as well as to produce their own research product. Metacognition helps students monitor whether they are engaged in the sort of deep learning that makes undergraduate research a HIP. Moreover, intentionally structuring metacognitive opportunities can encourage greater learner autonomy and help sustain inquiry-based research long after the undergraduate experience has officially concluded. In this way, undergraduate research and metacognition can be highly impactful because together they support the skills necessary for lifelong learning.

References

Draeger, J. (2014, July 11). Using metacognition to uncover the substructure of moral issues. Retrieved from https://www.improvewithmetacognition.com

Healey, M., & Jenkins, A. (2009). Developing undergraduate research and inquiry. York: HE Academy.

Kilgo, C. A., Sheets, J. K. E., & Pascarella, E. T. (2015). The link between high-impact practices and student learning: Some longitudinal evidence. Higher Education, 69(4), 509-525.

Kilgo, C. A., & Pascarella, E. T. (2016). Does independent research with a faculty member enhance four-year graduation and graduate/professional degree plans? Convergent results with different analytical methods. Higher Education, 71(4), 575-592.

Kuh, G. D. (2008). Excerpt from high-impact educational practices: What they are, who has access to them, and why they matter. Association of American Colleges and Universities.


Small Metacognition – Part II

By Jennifer A. McCabe, Ph.D., Goucher College

Just before the start of this spring semester, I decided to make a change in the structure of readings and discussions in my upper-level seminar course on Cognition, Teaching, and Learning. I had recently read James Lang’s (2016) book, Small Teaching: Everyday Lessons from the Science of Learning, and was inspired to include it in my class. Instead of jumping into a discussion of research articles, with a few popular press articles or book chapters included toward the end of the semester as examples of translational science writing, I flipped the order and started the course with three weeks of reading, discussing, and applying the information from Lang’s book (syllabus available upon request).


As described in my previous blog post (Small Metacognition I) the premise of Small Teaching is that evidence-based, incremental shifts in how teachers structure and deliver educational experiences can have a large pay-off in terms of student learning and engagement. The book speaks to multiple aspects of teacher metacognition (knowing about (students’) knowing, thinking about (students’) thinking, and learning about (students’) learning), even though the term itself is never mentioned.

My students were assigned to read one Small Teaching chapter per class day. For each, they prepared ‘Reading Responses’ consisting of three short paragraphs – from the perspective of a student, an educator, and a cognitive psychologist. At the start of each class period, they completed a ‘Comprehension Check’ question meant to give them feedback on their own learning of the day’s readings. This was self-graded with a check-plus/check/check-minus system, and was low-stakes in that only effort and completion counted. I mostly led these class periods, administering the Comprehension Check, engaging them in some type of active learning activity relevant to the day’s topic, and facilitating a discussion of the reading based on their Reading Responses. This first portion of the class was designed to help them learn about effective teaching through Lang’s book, and also through modeling my own class design and delivery.

This became important because after we finished the nine book chapters, we then shifted into primary source readings of research articles related to applied memory and the Scholarship of Teaching and Learning. During each class period with an article assigned, two students took the role of Discussion Leader; using what they learned from Small Teaching, and in consultation with me ahead of time, they curated and led a class period that included a Comprehension Check question (which should be effective for learning based on findings from memory research on testing as described in Lang’s Retrieving chapter), an active learning exercise (known to be effective based on ideas from the Connecting chapter), and an interactive discussion (relevant to elaboration-based strategies described in the Self-Explaining chapter). Students engaged in conversation about how the current article related to their reading of Small Teaching earlier in the semester (which itself highlighted the usefulness of spacing and mixing of topics, as described in the Interleaving chapter).

During these student-led class periods, I essentially became a member of the class, participating in demonstrations and discussions as a contributor but not as a leader. The Discussion Leaders were empowered in their choices of how to make the class period effective and engaging for their peers. Following each class, I provided feedback to the groups, with a particular emphasis early in the semester on strategies for the next time they led the class. The Discussion Leader experience helped develop their metacognition by requiring them to think intentionally about what makes learning experiences effective, and then about how to design and deliver such experiences. There were also times when they had to change teaching and learning strategies midstream if something was not working well (another component of sophisticated metacognition).

In order to better understand the student experience of reading Small Teaching, which is not aimed at an undergraduate audience, and also to more directly connect to topics in metacognition, I gave the thirteen students in my class an optional free-response survey completed during our final class period. In the spirit of transparency, I had them read my Small Metacognition I blog post first, and explained my plan to write a follow-up post about the class experience with Lang’s book. Each decided whether or not to allow me to use their responses. Though I use names below, pseudonyms are used as needed to reflect my students’ preference.  (Students, if you’re reading this, thank you for contributing to this post!)

I first asked about the most important or memorable lessons from Small Teaching. Several mentioned specific topics or chapters that were impactful, namely retrieving, interleaving, practicing, motivating, growing, and expanding. Many wrote about using evidence to inform teaching in ways that are incremental rather than complete overhauls. For example, Elise wrote, “The current way courses are structured are pretty terrible for durable learning. In order to better structure courses, professors can implement small but impactful techniques to encourage better learning in class as well as guide students towards more empirically supported learning and study methods.” Though metacognition did not come up directly, several responses were related. Anna wrote, “The most important lesson is that learning is complex and that there are many factors at play in the classroom.” Megan commented that it is critical that “both parties (teacher and student) understand exactly why they are doing what they do to learn.” And Katherine noted, “It made me reflect on my own experience in academics and my growth as a learner.”

Next I asked in what ways they think Small Teaching has changed (or will change) the way they think or act in the world. Here students clearly referenced metacognitive development, with Addy saying it “introduced a more metacognitive approach in education to me,” and Samantha noting, “I now have this toolbox of ways I can implement effective strategies.” Noah said, “I feel I have a better understanding of how my mindset can affect my ability to learn.” Anna’s response captured multiple aspects of metacognition: “As a student, Small Teaching (and our larger course discussions) has already shifted the way I think about and articulate my learning experiences. I really think the metacognitive awareness of learning how to learn has helped me to think about the strategies I have used and would like to further implement in my future learning.” Two additional students commented on improved metacognitive awareness.

Finally, students were asked whether they would recommend keeping Small Teaching as a core reading in this seminar course. Every student responded positively. They appreciated the book coming at the start of the class as a foundation for the research articles they would be reading, as a way to take an educator’s perspective on learning and memory research, and as an example of a translational piece (they created their own translational projects later in the semester).

I came away from this experience feeling pleased with my decision to incorporate Small Teaching into this class, and also feeling as though I myself had a significant learning (and metacognitive!) experience. Hopefully my students – most of whom were seniors, and some of whom will become teachers – will leave this course with a more sophisticated metacognitive perspective not only toward their own learning, but also toward purposeful and transparent design of educational experiences that effectively support others’ learning.

Recommended Reading

Lang, J. M. (2016). Small teaching: Everyday lessons from the science of learning. San Francisco, CA: Jossey-Bass.


Supporting Student Self-Assessment with Knowledge Surveys

by Dr. Lauren Scharff, U. S. Air Force Academy*

In my earlier post this year, “Know Cubed” – How do students know if they know what they need to know?, I introduced three challenges for accurate student self-assessment. I also introduced the idea of incorporating knowledge surveys as a tool to support student self-assessment (an aspect of metacognitive learning) and promote metacognitive instruction. This post shares my first foray into the use of knowledge surveys.

What exactly are knowledge surveys? They are collections of questions that support students’ self-assessment of their understanding of course material and related skills. Students complete the questions either at the beginning of the semester or prior to each unit of the course (pre), and then again immediately prior to exams (post-unit instruction). When answering the questions, students rate their ability to answer each question (similar to a confidence rating) rather than writing out a full answer. The type of learning expected is highlighted by including the Bloom’s taxonomy level at the end of each question. Completing knowledge surveys develops metacognitive awareness of learning and can help guide more efficient studying.
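To make this structure concrete, here is a minimal sketch of how a knowledge survey item and a student’s self-ratings might be represented in code; the field names, the example question, and the 1-3 rating scale are illustrative assumptions, not the format of any particular survey tool used in the course.

```python
# Hypothetical sketch of how knowledge survey items and self-ratings might be
# stored; field names and the 1-3 rating scale are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SurveyItem:
    question: str      # the course question students rate themselves on
    bloom_level: str   # e.g., "remember" or "explain"

@dataclass
class SelfRating:
    item: SurveyItem
    rating: int        # 1 = could not answer, 2 = partial, 3 = could answer fully
    timing: str        # "pre" (before the unit) or "post" (just before the exam)

item = SurveyItem("Explain the concept in your own words.", "explain")  # placeholder question
pre = SelfRating(item, rating=1, timing="pre")
post = SelfRating(item, rating=3, timing="post")
print(pre, post, sep="\n")
```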

Example knowledge survey questions

My motivation to include knowledge surveys in my course was a result of a presentation by Dr. Karl Wirth, who was invited to be the keynote speaker at the annual SoTL Forum we hold at my institution, the United States Air Force Academy. He shared compelling data and anecdotes about his incorporation of knowledge surveys into his geosciences course. His talk inspired several of us to try out knowledge surveys in our courses this spring.

So, after a semester, what do I think about knowledge surveys? How did my students respond?

In a nutshell, I am convinced that knowledge surveys enhanced student learning and promoted student metacognition about their learning. Their use provided additional opportunities to discuss the science of learning and helped focus learning efforts. But, there were also some important lessons learned that I will use to modify how I incorporate knowledge surveys in the future.

Evidence that knowledge surveys were beneficial:

My personal observations included the following, with increasing levels of each as the semester went on and students learned how to learn using the knowledge survey questions:

  • Students directly told me how much they liked and appreciated the knowledge survey questions. There is a lot of unfamiliar and challenging content in this upper-level course, so the knowledge survey questions served as an effective road map to help guide student learning efforts.
  • Students asked questions in class directly related to the knowledge survey questions (as well as other questions). Because I was clear about what I wanted them to learn, they were able to judge if they had solid understanding of those concepts and ask questions while we were discussing the topics.
  • Students came to office hours to ask questions, and were able to more clearly articulate what they did and did not understand prior to the exams when asking for further clarifications.
  • Students realized that they needed to study differently for questions at different Bloom’s levels of learning. “Explain” questions required more than basic memorization of the related terms. I took class time to suggest and reinforce more effective learning strategies, and several students reported increasing success with those strategies and that they had started using them in other courses (yay!).
  • Overall, students became more accurate in assessing their understanding of the material prior to each exam. When I compared the knowledge survey self-ratings with actual exam performance, students progressively became more accurate across the semester (a minimal sketch of this kind of comparison appears just after this list). I think some of this increase in accuracy was due to the changes noted in the points above.
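For anyone curious how such a comparison might be computed, here is a minimal sketch with invented numbers; the self-ratings, the scores, and the use of a simple correlation are illustrative assumptions, not the actual analysis I ran.

```python
# Hypothetical sketch: comparing knowledge-survey self-ratings with exam
# performance on matched questions. All numbers below are invented.
from statistics import correlation  # available in Python 3.10+

self_ratings = [3, 2, 3, 1, 2, 3, 2, 1, 3, 2]             # 1-3 self-rated ability per question
exam_scores  = [95, 70, 90, 40, 65, 85, 75, 50, 92, 60]   # percent earned on the same questions

# A higher correlation suggests self-assessments track actual performance more
# closely, i.e., the student is better calibrated.
print(f"calibration r = {correlation(self_ratings, exam_scores):.2f}")
```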

Student feedback included the following:

  • End-of-semester feedback indicated that the vast majority of students thought the knowledge surveys supported their learning, with half giving the highest rating of “definitely supports learning, keep as is.”
  • Optional reflection feedback suggested that students developed learning skills related to the knowledge surveys and perceived value in using them. The following quote was typical of many students:

At first, I was not sure how the knowledge surveys were going to help me. The first time I went through them I did not know many of the questions and I assumed they were things I was already supposed to know. However, after we went over their purpose in class my view of them changed. As I read through the readings, I focused on the portions that answered the knowledge survey questions. If I could not find an answer or felt like I did not accurately answer the question, I bolded that question and brought it up in class. Before the GR, I go back through a blank knowledge survey and try to answer each question by myself. I then use this to compare to the actual answers to see what I actually need to study. Before the first GR I did not do this. However, for the second GR I did and I did much better.

Other Observations and Lessons Learned:

Although I am generally pleased with my first foray into incorporating knowledge surveys, I did learn some lessons and I will make some modifications next time.

  • The biggest lesson is that I need to take even more time to explain knowledge surveys, how students should use them to guide their learning, and how I use them as an instructor to tailor my teaching.

What did I do this past semester? I explained knowledge surveys on the syllabus and verbally at the beginning of the semester. I gave periodic general reminders and included a slide in each lesson’s PPT that listed the relevant knowledge survey questions. I gave points for completion of the knowledge surveys to increase the perception of their value. I also included instructions about how to use them at the start of each knowledge survey:

Knowledge survey instructions

Despite all these efforts, feedback and performance indicated that many students really didn’t understand the purpose of knowledge surveys or take them seriously until after the first exam (and some even later than that). What will I do in the future? In addition to the above, I will make more explicit connections during the lesson and as students engage in learning activities and demonstrations. I will ask students to share how they would explain certain concepts using the results of their activities and the other data that were presented during the lesson. The latter will provide explicit examples of what would (or would not) be considered a complete answer for the “explain” questions in contrast to the “remember” questions.

  • The biggest student suggestion for modifying the knowledge surveys pertained to the “pre” surveys given at the start of each unit. Students reported that they didn’t know most of the answers and felt that completing the pre surveys was less useful. As an instructor, those “pre” responses helped me get a pulse on their level of prior knowledge and tailor my lessons accordingly. Thus, I need to better communicate how I use those “pre” results, because no one likes to take time to do what they perceive as “busy work.”
  • I also learned that students created a shared GoogleDoc where they inserted answers to the knowledge survey questions. I am all for students helping each other learn, and I encourage them to quiz each other so they can talk out the answers rather than simply re-reading their notes. However, it became apparent when students came in for office hours that the shared “answers” were not always correct and were sometimes incomplete. This was especially true for the higher-level questions. I was not a member of the shared document myself, so I did not check the answers there. In the future, I will encourage students earlier and more explicitly to be aware of the type of learning being targeted and the type of response needed for each level, and to critically evaluate the answers being entered into such a shared document.

In sum, as an avid supporter of metacognitive learning and metacognitive instruction, I believe that knowledge surveys are a great tool for supporting both student and faculty awareness of learning, the first step in metacognition. We should then use that awareness to make necessary adjustments to our efforts, the other half of a continuous cycle that leads to increased student success.

———————————————–

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.