Thinking like a Sociologist, but how? Using Reflective Worksheets to Enhance Metacognition in a Classroom with Diverse Learners

Mabel Ho, Department of Sociology, University of British Columbia
Katherine Lyon, Department of Sociology and Vantage One, UBC
Jennifer Lightfoot, Academic English Program, Vantage One, UBC
Amber Shaw, Academic English Program, Vantage One, UBC

Background and Motivation for Using Reflective Worksheets in Introductory Sociology

Research shows that for first year students in particular, lectures interspersed with active learning opportunities are more effective than either pedagogical approach on their own (Harrington & Zakrajsek, 2017). In-class reflection opportunities are a form of active learning shown to enhance cognitive engagement (Mayer, 2009), critical thinking skills (Colley et al., 2012), and immediate and long-term recall of concepts (Davis & Hult, 1997) while reducing information overload which can limit learning (Kaczmarzyk et al., 2013). Further, reflection conducted in class has been shown to be more effective than outside of class (Embo et al., 2014). Providing students with in-class activities which explicitly teach metacognitive strategies has been shown to increase motivation, autonomy, responsibility and ownership of learning (Machaal, 2015) and improve academic performance (Aghaie & Zhang, 2012; Tanner, 2012).

We created and implemented reflective worksheets (see Appendix) in multiple sections of a first-year sociology course at a large research university with a high proportion of international English as an Additional Language (EAL) students. While all first-year students must learn to navigate both the academic and discipline-specific language expectations of university, many international students face additional barriers: new expectations must be met through an additional language, often alongside different cultural assumptions, such as unfamiliarity with the active learning practices and thought processes privileged in Western academic institutions. With both domestic and international students in mind, our aims with these reflective worksheets are to:

  • facilitate and enhance students’ abilities to notice and monitor disciplinary awareness and knowledge while promoting disciplinary comprehension and practices.
  • connect course material to personal experiences (micro) and social trends (macro).

Nuts and Bolts: Method

We structured individual writing reflection opportunities every 10-15 minutes in each lecture in the small (25 students), medium (50 students) and large (100 students) classes. Each lesson was one hour and students completed the worksheets during class time in five-minute segments. The worksheets had different question prompts designed to help students:

  • identify affective and cognitive pre-conceptions about topics
  • paraphrase or explain concepts
  • construct examples of concepts just learned
  • contrast terms
  • describe benefits and limitations of social processes
  • relate a concept to their own lives and/or cultural contexts
  • discover connections between new material and prior knowledge (Muncy, 2014)
  • summarize key lecture points (Davis & Hult, 1997)
  • reflect on their own process of learning (see Appendix for further examples)

The question prompts are indicative of how to think about a topic, rather than what to think. These reflective worksheets are a way to teach students to think like disciplinary specialists in sociology, which aligns with the course learning outcomes. Completed worksheets were graded by teaching assistants (TAs), who used the rubric below (see Table 1) to assess students’ application and critical thinking skills. By framing the worksheets as participation marks, students were motivated to complete the assigned work while learning how to approach sociology as a discipline. As suggested in “promoting conceptual change” (Tanner, 2012), some of the worksheets required students to recognize their preconceived notions and monitor their own learning and re-learning. For example, in one worksheet, students recorded their preconceptions about a social issue (e.g., marijuana usage) at the beginning of the lecture and returned to the same question at the end of class. Through this process, students have a physical record of their evolving beliefs, whether that means recognizing and adjusting preconceived notions or deepening justifications for existing beliefs.

Table 1: Sample Assessment Rubric

3: Entry is thoughtful, thorough and specific. Author draws on relevant course material where appropriate. Author demonstrates original thinking. Entries correspond to questions asked.
2: Entry is relevant but may be vague or generic. Author could improve the response by making it more specific, thoughtful or complete.
1: Entry is unclear, irrelevant, incomplete or demonstrates a lack of understanding of core concepts.

Outcomes: Lessons Learned

We found the reflective worksheets were effective because they gave students time to think about what they were learning and, over time, increased their awareness of how disciplinary knowledge is constructed. For us as instructors, the worksheets were a useful tool for monitoring students’ learning and the ‘take away’ messages they drew from lectures. We also used the worksheets as a starting point in the next lecture to clarify any misunderstandings.

Overall, we found that while the reflective worksheets seemed to be appreciated by all students, EAL students benefitted in a number of specific ways. First, the guided questions gave students additional time to think about the topic at hand and to prepare before classroom discussion. Instead of being cold-called, students could gather their thoughts and actively process what they had just learned. Second, students were able to explore the structure of academic discourse within the discipline of sociology. As students learn through different disciplinary lenses, these worksheets reveal how a sociologist approaches a topic. In our case, international EAL students were also taking courses such as psychology, academic writing, and political science; each of these disciplines engages with a topic using a different lens and language, and the worksheets made the sociological approach explicit. Last, the worksheets allowed students to reflect on both the content and the way language is used within sociology. For example, the worksheets gave students time to brainstorm about what questions are explored from a disciplinary perspective and what counts as evidence. Furthermore, when given time to reflect on the strength of disciplinary evidence, students can determine which language features may be most appropriate for presenting it, such as whether hedges (may indicate, possibly suggest, etc.) or boosters (definitely proves) would be more fitting. When working with international EAL students, it is especially important to make such language features visible so students can, in turn, take ownership of them in their own language use. Looking forward, these worksheets can help guide both EAL and non-EAL students’ awareness of how knowledge is constructed in the discipline and how language can be used to demonstrate disciplinary understanding.

References

Aghaie, R., & Zhang, L. J. (2012). Effects of explicit instruction in cognitive and metacognitive reading strategies on Iranian EFL students’ reading performance and strategy transfer. Instructional Science, 40(6), 1063-1081.

Colley, B. M., Bilics, A. R., & Lerch, C. M. (2012). Reflection: A key component to thinking critically. The Canadian Journal for the Scholarship of Teaching and Learning, 3(1). http://dx.doi.org/10.5206/cjsotl-rcacea.2012.1.2

Davis, M., & Hult, R. E. (1997). Effects of writing summaries as a generative learning activity during note taking. Teaching of Psychology, 24(1), 47-50.

Embo, M. P. C., Driessen, E., Valcke, M., & Van Der Vleuten, C. P. (2014). Scaffolding reflective learning in clinical practice: A comparison of two types of reflective activities. Medical Teacher, 36(7), 602-607.

Harrington, C., & Zakrajsek, T. (2017). Dynamic Lecturing: Research-based Strategies to Enhance Lecture Effectiveness. Stylus Publishing, LLC.

Kaczmarzyk, M., Francikowski, J., Łozowski, B., Rozpędek, M., Sawczyn, T., & Sułowicz, S. (2013). The bit value of working memory. Psychology & Neuroscience, 6(3), 345-349.

Machaal, B. (2015). Could explicit training in metacognition improve learners’ autonomy and responsibility? Arab World English Journal, 6(1), 267.

Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York, NY: Cambridge University Press.

Muncy, J. A. (2014). Blogging for reflection: The use of online journals to engage students in reflective learning. Marketing Education Review, 24(2), 101-114. doi:10.2753/MER1052-8008240202

Tanner, K. D. (2012). Promoting student metacognition. CBE-Life Sciences Education, 11(2), 113-120.


Prompted Written Reflection as a Tool for Metacognition: Applying Theory in Feedback

Dr. Phani Radhakrishnan & Emma Kerr
Management Department, University of Toronto
Contact: phani.radhakrishnan@utoronto.ca & emma.kerr@mail.utoronto.ca

Background/Motivation

Understanding how to seek feedback is a core topic in the curriculum of leadership courses. However, not all feedback-seeking activities are effective. The purpose of this activity is to help students apply empirical research about feedback to their experiences of receiving and interpreting feedback. We hoped the activity would enable them not only to understand how to seek and interpret feedback but also to learn how to apply it to their own learning, and thus encourage metacognitive thinking.

Method

This activity took place approximately midway through the semester in a third-year mandatory course for students in the business administration program, with approximately 40 students in each class. Students were introduced to the rationale for the activity by reading DeNisi and Kluger’s (2000) review article about the relation between feedback and performance. They then answered questions requiring them to explain the key concepts in the reading and apply the theory to a real-life example (see Appendix A). Next, students listened to a short lecture explaining the theory behind the factors that increase and decrease the effectiveness of feedback. The lecture focused on the finding that the way people seek feedback can affect their subsequent performance. If people seek task-based feedback, by looking for the correct answer or by asking how to improve their answer with a focus on learning the task, their subsequent performance on the task will improve. But if they seek self-based feedback, by looking for information on how they did relative to others, such as asking for the class average, their subsequent performance on the task will not improve.

Then, students reflected on how they could apply this knowledge to their own lives. They wrote a response to the following question: “Consider a situation where you got feedback and that did not help you improve your subsequent performance. Explain why the feedback was ineffective in terms of task, task learning or self-focus. What could you have done to increase your focus on task, or task learning and decrease attention to self?” Next, we gave students examples of learning goals (see Appendix B) and asked them to write learning goals with reference to the example they had just written about. Finally, students who volunteered read their written reflections aloud to the rest of the class. Students received participation marks for the homework activity as well as for the in-class discussion. In addition, students were asked a similar question on the final exam about feedback they received in this course. Specifically, they had to reflect on the feedback they received on a course assignment in which they were evaluated on argumentation, definitional, data-analytical, and descriptive skills, and to set a learning goal for improving those skills. We hoped these multiple writing prompts would encourage students to apply research about feedback to their own experiences with receiving and seeking feedback. Such written reflections should encourage students to learn how they may help or hurt themselves through the kind of information they focus on when asking for feedback from instructors (e.g., asking for the class average vs. asking how to improve, or asking for the correct answer).

Our activity was guided by research suggesting that a written prompt encouraging students to think critically about their own experiences can foster metacognition. For example, Ratto-Parks (2015) asked first-year college students to think about a rhetorical story assignment they had completed in a course and reflect on what they did well and where they could improve. She found that student reflections improved metacognition and strengthened writing quality. Just as Ratto-Parks’ activity encouraged students to reflect on, and thus improve, their writing skills, we hoped that our questions guided students toward the kinds of information they should focus on while seeking feedback, encouraged them to think metacognitively about their experiences with feedback, and prompted them to reflect on how to use feedback-seeking opportunities to improve themselves. Similarly, other studies have found that written prompts eliciting critical reflection can trigger metacognitive processes (Erskine, 2009; Harten, 2014; Lew & Schmidt, 2011).

Further, feedback itself has also been shown to improve metacognition (Callender, 2016). In our activity, we evaluated students on multiple skills in their course-related writing assignment (e.g., argumentation, definitional skills, etc.) and then asked them to reflect on what that feedback means. By asking students to relate information learned in the course to their past feedback-seeking experiences, and by providing opportunities to apply that knowledge while they are receiving feedback in the course, we believe we are helping students improve their metacognitive skills, since they are using both written reflections and feedback as tools to develop those skills. Taken together, the in-class writing exercise, an explanation of the theory behind feedback, an opportunity to get feedback, and answering a question on the final exam about that feedback should all improve metacognitive skills, as predicted by the research cited above.

Outcomes

Preliminary analysis shows that highly engaged students (i.e., those who read the article, answered the homework questions, wrote a reflection and participated in class discussion) tended to achieve higher marks on the related final exam question. Overall, students showed an improved understanding of effective feedback following the in-class activity. We are motivated to continue systematically analyzing student responses to the initial in-class reflection questions and to the final exam questions. We hope to detect metacognitive thinking by using Ratto-Parks’ (2015) Index of Metacognitive Knowledge in Critical Reflective Writing, which shows promise in translating metacognitive language into identifiable traits that can be used to assess students’ reflections. This analysis could be challenging because our activity consists of only one in-class reflection question based on prior feedback-seeking experiences and one final exam question based on a feedback-seeking experience in the course itself, whereas most studies include multiple written reflections. To detect improvement in metacognition we may need to encourage students to repeatedly answer questions about what they are learning in multiple feedback contexts. This is similar to our prior research (Radhakrishnan, Arrow, & Sniezek, 1996), which shows that asking students to repeatedly evaluate their performance over multiple tests, after receiving feedback on each test, improves the accuracy of their evaluations. Improving students’ understanding of what they are learning, that is, their metacognitive skills, may follow a similar mechanism. Multiple written reflections on how to interpret feedback, while receiving feedback on multiple tasks, can help students gain an improved understanding not only of the theory of feedback but also of themselves.

We expect that both the improved understanding of effective feedback processes as well as the opportunity to practice metacognition will help students to interpret and give feedback more effectively both within and outside of the course. Since our students are in the management discipline, seeking feedback effectively is a skill that is essential to their professional development as leaders. In addition, we predict that the improved experience with metacognitive processes will aid them in thinking critically and interpreting feedback in their other courses as well.

References

DeNisi, A. S., & Kluger, A. N. (2000). Feedback effectiveness: Can 360-degree appraisals be improved? Academy of Management Perspectives, 14(1), 129-139. doi:10.5465/ame.2000.2909845

Erskine, D. L. (2009). Effect of prompted reflection and metacognitive skill instruction on university freshmen’s use of metacognition (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses database.

Harten, M. D. (2014). An evaluation of the effectiveness of written reflection to improve high school students’ metacognitive knowledge and strategies (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses database.

Lew, D. N., & Schmidt, H. G. (2011). Writing to learn: can reflection journals be used to promote self-reflection and learning? Higher Education Research & Development, 30(4), 519-532. doi:10.1080/07294360.2010.512627

Radhakrishnan, P., Arrow, H., & Sniezek, J. A. (1996). Hoping, performing, learning, and predicting: Changes in the accuracy of self-evaluations of performance. Human Performance, 9(1), 23-49. doi:10.1207/s15327043hup0901_2

Ratto Parks, A. E. (2015). The power of critical reflection: Exploring the impact of rhetorical stories on metacognition in first year composition stories (Doctoral dissertation).

*

Appendix A—Homework Question

DeNisi and Kluger’s (2000) article explains that the way in which feedback is given can draw one’s attention to oneself, and that this attention to self leads to negative effects on subsequent performance (after the feedback). However, the article also discusses conditions under which feedback focused on the self may not necessarily lead to negative effects. Explain how this occurs.

Use the concepts of ought vs. ideal self, and promotion vs. prevention focus. Illustrate how this theory applies in a concrete real-life situation or example.

*

Appendix B—Examples of Learning Goals

(The following is displayed on a slide in lecture to aid students in developing their own learning goals)

For a professor…

  • Finding specific ways to explain complex material in memorable ways
  • Explaining concepts by giving examples and counter-examples
  • Explaining theories by giving concrete examples of the process of how they work
  • Showing the relevance of the subject matter to students’ lives outside the classroom

For a golfer…

  • Mastering the proper grip of the club
  • Mastering proper placement of the feet
  • Learning when to use which club
  • Understanding the distribution of weight from one foot to the other when swinging the club

Practicing Metacognitive Awareness with Guided Lecture Notes

Dr. Terrell Hooper
Assistant Professor of Music
American University of Sharjah
Email | thooper@aus.edu

Background/Motivation:

I teach an Elements of Music course for music minors and a general population of engineering, business and architecture students needing to earn a general arts credit. I have experienced many challenges in teaching such a course in the Middle East, where students have often never been exposed to any elements of western music or history. The course surveys the entire gamut of western music and history while simultaneously giving a foundational understanding of music literacy. Given the vast parameters of the course, students are expected to have strong independent study skills. While study habits are primarily individual and differ with each student, I found students were not prepared or equipped with the basic study skills required to be successful in the course. The most basic skill lacking was the ability to take notes or organize the material being discussed in class. In addition, student feedback on end-of-course evaluations revealed that the information and material discussed in class was so unfamiliar and vast that students did not know how to organize or digest it. From the gathered data, I inferred that students needed a note-taking model, an opportunity to take notes of their own volition, and a moment to reflect on their note-taking abilities. By implementing these objectives, I wanted to observe whether they would encourage students to think in a metacognitive manner and perhaps awaken them to the importance of metacognitive practices in their own study habits.

Method:

Due to the pedagogical “bumps” that I experienced in my first semester of teaching Elements of Music, I decided to create a sequential curriculum (see Figure 1) that would provide students with guided lecture notes[1]. The purpose of these notes was twofold: 1) to help students structure information being discussed during class, and 2) to help students remember, reflect on, and re-organize course content during independent study. The sequential curriculum was aligned with the syllabus; students were not taught note-taking skills but were merely provided with guided lecture notes that I prepared prior to each class meeting.

Three sets of guided lecture notes were given over a three-week period (see Appendix A). Each week the guided lecture notes incorporated a progressive guide for helping students become more metacognitively aware of proper note-taking habits during in-class lectures. The first set was designed to orient students to taking notes in an outline format and contained fill-in-the-blank areas placed throughout the outline. Subsequent guided lecture notes included reflective questions at the end of the lecture: Lecture 2 contained recall questions and Lecture 3 contained essay questions concerning content discussed during the lecture. All three sets of guided lecture notes were collected after each class, and data were recorded on how many students completed the entire handout, with each rated on its overall completion (i.e., the number of blanks left on the handout). Lecture 4 (the Classical Lecture) did not use guided lecture notes, and no instruction or requirements for note taking were given, because I wanted to observe how many students saw the need to take notes of their own volition.

Following the review session (see Figure 1), a midterm exam was administered. The midterm exam consisted of multiple choice, fill-in-the-blank, and true or false questions copied verbatim from the previous semester’s exam, so the data could be compared with how students not exposed to guided lecture notes scored on the same questions. After the midterm exam, students were given a survey via Google Forms asking about the usefulness of the guided lecture notes. Finally, I gave a ten-minute lecture informing students of the data gathered in the questionnaire, statistics on how many students completed each handout during each lecture, and a comparison of exam scores between students who used guided lecture notes and students from the Fall semester who did not.

Figure 1. Sequential Curriculum for Guided Lecture Notes

Data Outcomes:

All students enrolled in Elements of Music for Spring semester participated in the study (n=29), however, due to random class absences, Lecture 1 had 27 participants, Lecture 2 had 28 participants and Lecture 3 had 25 participants. A set of 30 questions derived directly from the lecture notes were used on the midterm exam for students in Spring semester (n=29) and the final exam for students from Fall semester (n=28). Each exam question (n=30) was scored as correct or incorrect on both Fall and Spring student exams and the total number of incorrect answers was calculated for each student. An independent t-test revealed no significant difference between groups, t(53)=1.02, p=.31; Mean (Std Dev) Fall Semester = 4.9 (3.4) and Mean (Std Dev) Spring Semester = 4.0 (2.9).
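For readers who want to check the comparison, the reported statistic can be approximately reproduced from the summary values alone. The sketch below (Python, using Welch’s unequal-variance formula, which matches the reported df of 53) recomputes t from the rounded means and standard deviations; the small discrepancy from the published t = 1.02 is expected because the published inputs are rounded.

```python
from math import sqrt

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's independent-samples t-test computed from summary statistics."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2          # per-group variance of the mean
    t = (m1 - m2) / sqrt(v1 + v2)                 # t statistic
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))  # Welch df
    return t, df

# Summary statistics reported above: Fall 4.9 (3.4), n=28; Spring 4.0 (2.9), n=29
t, df = welch_t(4.9, 3.4, 28, 4.0, 2.9, 29)
print(f"t({df:.0f}) = {t:.2f}")  # prints t(53) = 1.07 with these rounded inputs
```

The df of 53 (rather than the pooled 55) confirms the unequal-variance form was used in the original analysis.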

On a more positive note regarding the incorporation of the guided lecture notes, students who participated in the questionnaire (n=23) gave strongly positive ratings for the notes. They rated their overall satisfaction on a 3-point Likert scale ranging from unsatisfactory through satisfactory to extremely satisfactory. Results indicated that 69.6% (n=16) of students surveyed were extremely satisfied with using guided lecture notes and 30.4% (n=7) chose the middle option, indicating neither unsatisfied nor extremely satisfied. Open-ended student feedback on using teacher-guided lecture notes is presented in Table 1.

Table 1. Student Feedback Using Guided Lecture Notes

Pros:
  • “Provides important details and helps us focus on what is more important”
  • “Guides me through the chapters while studying from the book”
  • “They were a very good guide when it came to studying for midterms as they summarized the main concepts”
  • “These outlines make it easier to understand and absorb the material faster”

Cons:
  • “More detailed questions”
  • “Sometimes the questions are vague and need clarification”
  • “Include a list of keywords”
  • “The information was a lot and we didn’t have enough time to complete it during class while the professor was explaining it. Sometimes I felt I couldn’t keep up the pace while listening to the lecture and writing thus I left many blanks to fill in later which made me unsure of my answers.”

Observations:

The primary purpose of this research study was to 1) help students structure information being discussed during class and 2) help students remember, reflect on, and re-organize course content during independent study. The study illustrated that when students organize, reflect, and collaborate with their teachers on their own learning, the pedagogical process improves. Although the data do not confirm that guided lecture notes improve test scores, it would be remiss not to acknowledge that students enjoy being provided with a structure for organizing the information presented during lectures. In addition, no negative feedback concerning the amount of material or the organizational components of the course was received on end-of-course student evaluations.

The intent of having students take personal initiative in note taking during Lecture 4 (see Figure 1), and of giving an informative ten-minute lecture on the possible gains of using such an organizational scheme when listening to class lectures, was to help students think more about their own study skills. However, generally speaking, I did not observe most students beginning to practice metacognition regarding their own study habits. I actually observed students wanting or expecting the guided lecture notes for every class, and the end-of-course student evaluations noted that students wanted guided lecture notes for each class lecture. Even though students reflected positively on the usefulness of the guided lecture notes, I observed a disconnect in motivating students to take personal initiative in their own study habits. Future research should investigate the link between in-class lectures and how students become more self-directed in their independent study habits.

[1] Guided notes are defined as “teacher-prepared handouts that ‘guide’ a student through a lecture with standard cues and prepared space in which to write the key facts, concepts, and/or relationships” (Heward, 1994, p. 304).


Supporting Student Self-Assessment with Knowledge Surveys

by Dr. Lauren Scharff, U. S. Air Force Academy*

In my earlier post this year, “Know Cubed” – How do students know if they know what they need to know?, I introduced three challenges for accurate student self-assessment. I also introduced the idea of incorporating knowledge surveys as a tool to support student self-assessment (an aspect of metacognitive learning) and promote metacognitive instruction. This post shares my first foray into the use of knowledge surveys.

What exactly are knowledge surveys? They are collections of questions that support students’ self-assessment of their understanding of course material and related skills. Students complete the questions either at the beginning of the semester or prior to each unit of the course (pre), and then again immediately prior to exams (post-unit instruction). When answering the questions, students rate their ability to answer each question (similar to a confidence rating) rather than fully answering it. The type of learning expectation is highlighted by including the Bloom’s level at the end of each question. Completing knowledge surveys develops metacognitive awareness of learning and can help guide more efficient studying.

Example knowledge survey questions

My motivation to include knowledge surveys in my course was a result of a presentation by Dr. Karl Wirth, who was invited to be the keynote speaker at the annual SoTL Forum we hold at my institution, the United States Air Force Academy. He shared compelling data and anecdotes about his incorporation of knowledge surveys into his geosciences course. His talk inspired several of us to try out knowledge surveys in our courses this spring.

So, after a semester, what do I think about knowledge surveys? How did my students respond?

In a nutshell, I am convinced that knowledge surveys enhanced student learning and promoted student metacognition about their learning. Their use provided additional opportunities to discuss the science of learning and helped focus learning efforts. But there were also some important lessons learned that I will use to modify how I incorporate knowledge surveys in the future.

Evidence that knowledge surveys were beneficial:

My personal observations included the following, with increasing levels of each as the semester went on and students learned how to learn using the knowledge survey questions:

  • Students directly told me how much they liked and appreciated the knowledge survey questions. There is a lot of unfamiliar and challenging content in this upper-level course, so the knowledge survey questions served as an effective road map to help guide student learning efforts.
  • Students asked questions in class directly related to the knowledge survey questions (as well as other questions). Because I was clear about what I wanted them to learn, they were able to judge if they had solid understanding of those concepts and ask questions while we were discussing the topics.
  • Students came to office hours to ask questions, and were able to more clearly articulate what they did and did not understand prior to the exams when asking for further clarifications.
  • Students realized that they needed to study differently for the questions at different Bloom’s levels of learning. “Explain” questions required more than basic memorization of the terms related to those questions. I took class time to suggest and reinforce the use of more effective learning strategies and several students reported increasing success and the use of those strategies for other courses (yay!).
  • Overall, students became more accurate in assessing their understanding of the material prior to the exam. More specifically, when I compared the knowledge survey reports with actual exam performance, students progressively became more accurate across the semester. I think some of this increase in accuracy was due to the changes stated in points above.

Student feedback included the following:

  • End-of-semester feedback from students indicated that the vast majority of them thought the knowledge surveys supported their learning, with half of them giving the highest rating of “definitely supports learning, keep as is.”
  • Optional reflection feedback suggested that students developed learning skills through their use of the knowledge surveys and perceived them as valuable. The following quote was typical of many students:

At first, I was not sure how the knowledge surveys were going to help me. The first time I went through them I did not know many of the questions and I assumed they were things I was already supposed to know. However, after we went over their purpose in class my view of them changed. As I read through the readings, I focused on the portions that answered the knowledge survey questions. If I could not find an answer or felt like I did not accurately answer the question, I bolded that question and brought it up in class. Before the GR, I go back through a blank knowledge survey and try to answer each question by myself. I then use this to compare to the actual answers to see what I actually need to study. Before the first GR I did not do this. However, for the second GR I did and I did much better.

Other Observations and Lessons Learned:

Although I am generally pleased with my first foray into incorporating knowledge surveys, I did learn some lessons and I will make some modifications next time.

  • The biggest lesson is that I need to take even more time to explain knowledge surveys, how students should use them to guide their learning, and how I use them as an instructor to tailor my teaching.

What did I do this past semester? I explained knowledge surveys on the syllabus and verbally at the beginning of the semester. I gave periodic general reminders and included a slide in each lesson’s PPT that listed the relevant knowledge survey questions. I gave points for completion of the knowledge surveys to increase the perception of their value. I also included instructions about how to use them at the start of each knowledge survey:

Knowledge survey instructions

Despite all these efforts, feedback and performance indicated that many students really didn’t understand the purpose of knowledge surveys or take them seriously until after the first exam (and some even later than that). What will I do in the future? In addition to the above, I will make more explicit connections during the lesson and as students engage in learning activities and demonstrations. I will ask students to share how they would explain certain concepts using the results of their activities and the other data that were presented during the lesson. The latter will provide explicit examples of what would (or would not) be considered a complete answer for the “explain” questions in contrast to the “remember” questions.

  • The biggest student feedback suggestion for modification of the knowledge surveys pertained to the “pre” knowledge surveys given at the start of each unit. Students reported they didn’t know most of the answers and felt like completion of the pre knowledge surveys was less useful. As an instructor, those “pre” responses helped me get a pulse on their level of prior knowledge and use that to tailor my lessons. Thus, I need to better communicate my use of those “pre” results because no one likes to take time to do what they perceive as “busy work.”
  • I also learned that students created a shared GoogleDoc where they would insert answers to the knowledge survey questions. I am all for students helping each other learn, and I encourage them to quiz each other so they can talk out the answers rather than simply re-reading their notes. However, it became apparent when students came in for office hours that the shared “answers” to the questions were not always correct and were sometimes incomplete. This was especially true for the higher-level questions. I personally was not a member of the shared document, so I did not check their answers in that document. In the future, I will encourage students, earlier and more explicitly, to be aware of the type of learning being targeted and the type of responses needed for each level, and to critically evaluate the answers being entered into such a shared document.

In sum, as an avid supporter of metacognitive learning and metacognitive instruction, I believe that knowledge surveys are a great tool for supporting both student and faculty awareness of learning, the first half of metacognition. We should then use that awareness to make necessary adjustments to our efforts – the other half of a continuous cycle that leads to increased student success.

———————————————–

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


How to Get the Most Out of Studying

Dr. Stephen Chew has put together a highly lauded series of short videos that share with students some powerful principles of effective learning, including metacognition. His goal was to create a resource that students can view whenever and as often as they want.

They include:

  • Video 1: Beliefs That Make You Fail…Or Succeed
  • Video 2: What Students Should Understand About How People Learn
  • Video 3: Cognitive Principles for Optimizing Learning
  • Video 4: Putting the Principles for Optimizing Learning into Practice
  • Video 5: I Blew the Exam, Now What?

Links to the videos can be found here:

https://www.samford.edu/departments/academic-success-center/how-to-study

Dr. Chew also provides an overview handout that summarizes the purposes of the videos, gives guidance on how to use them, and outlines the main points within the videos:

https://www.samford.edu/departments/files/Academic_Success_Center/How-to-Study-Teaching_Resources.pdf


Developing Metacognition with Student Learning Portfolios

In this IDEA paper #44, The Learning Portfolio: A Powerful Idea for Significant Learning, Dr. John Zubizarreta shares models and guidance for incorporating learning portfolios. He also makes powerful arguments regarding the ability of portfolios to engage students in meaningful reflection about their learning, which in turn supports metacognitive development and life-long learning.

 


“Know Cubed” – How do students know if they know what they need to know?

by Dr. Lauren Scharff, U. S. Air Force Academy*



This simple but somewhat tongue-twisting question takes us to several challenging aspects of teaching and learning that link to both student and instructor metacognition:

  1. How do students self-assess their understanding and abilities prior to assessments?
  2. Are students able to accurately know what they are expected to be able to demonstrate for an assessment?
  3. What can we as instructors reasonably do to be transparent regarding our learning expectations and to support student development of accurate self-assessment?

Generally speaking, humans ARE good at self-assessment, as long as the self-assessment activity/tool is well-aligned with the actual assessment activity/tool (e.g. see Nuhfer, 2015). However, there are many possible reasons why students may not accurately self-assess, and several of those are directly under our control as instructors.

Thus, I believe we should engage in metacognitive instruction by developing our awareness of common reasons that students may not accurately self-assess, what we might be doing that inadvertently leads to those pitfalls, and some means by which we can support more accurate student self-assessment. We should then intentionally use that awareness to adjust what we do. This combination of awareness and self-regulation provides the foundation for metacognitive instruction.

Based on my observations and discussions with colleagues across the years, here are three common reasons students might not accurately self-assess along with some strategies instructors might take to support better student self-assessment:

  1. Lesson-to-Exam Misalignment – For example, classroom instruction and activities sometimes focus on basic concepts and definitions, while exams ask for evaluation and synthesis. Students may self-assess their competency based on what was presented in the lesson, but then feel surprised and perform poorly on the exam when they are asked to go beyond the lower level. Even if instructors “warn” students that they will need to engage in higher-level processing on the exams, if students haven’t been given the opportunity to experience what that means and practice it, those students may not accurately self-assess their preparedness for the exam. Instructors should analyze the levels and types of learning materials they present in class and require of students during formative learning activities (in-class activities, homework, quizzes). Then, they should align their exams to have similar levels of expectation. If they desire higher-level learning to be demonstrated on exams, they should redesign their learning activities to allow scaffolding, practice, and feedback with those higher-level expectations.
  2. Confusing Questions – Students often claim that questions on exams are confusing, even if they don’t seem confusing from the instructor’s perspective. Thus, students might actually be accurate in their self-assessment of their understanding of a topic, but then fail to demonstrate it because they were confused by the question or simply misread it. Test anxiety can add additional cognitive load and make it more likely for students to misread questions. Thus, instructors should review their questions to find ways to more clearly indicate what they expect in a response. For example, if a question has two parts, rather than writing one long question, break it into part (a) and part (b). This formatting clearly communicates that a good response should have two parts. It can often be difficult for the person writing a question to assess its clarity because they know what they mean, so it seems obvious. (Instructors can also fall into this trap when reviewing test-bank questions in which the correct answer is clearly indicated. Once the answer is known, it seems obvious.) Being aware of these pitfalls and taking the time to critically analyze one’s test questions is a good way to engage in metacognitive instruction. Having a colleague from a different area of expertise read through the questions before finalizing them can also help catch instances where clarity could be improved.
  3. Smooth Presentations – Instructors are experts, and they generally like to be perceived as such. Thus, it is far more common for instructors to present problem work-outs or other complex material in ways that make it look smooth and easy. That seems good, right? Actually, smooth presentations can mislead students into thinking that the material is easy and may not prompt them to ask questions. Following a smooth presentation, students might then self-assess as understanding the material when really they would not be able to work out a problem on their own. Explicit step-by-step examples in textbooks also sometimes fool students into thinking they know how to work out problems if the assigned homework can be completed by following the examples. Instructors should consider verbalizing points of possible confusion that they know often catch students, or sharing their own struggles as they learned the material in the past. As they work out problems in front of the class, they could ask what worked, what didn’t, and what changes could be made in the problem-solving approach (or writing approach, or presentation of an argument, etc.). They should also emphasize to students that they will be better able to self-assess their preparation for an exam if they work out problems without the examples in front of them.

The above challenges for accurate student self-assessment and instructor strategies to address them are just a start to help us become metacognitive instructors and help students become more metacognitive learners. In my next post I will share with you my recent exploration into the use of Knowledge Surveys. This tool directly helps students develop more accurate self-assessment. Further, with direction and encouragement from the instructor, knowledge surveys can help students become metacognitive learners by using their awareness of their learning to guide their use of learning strategies.

There are many routes to becoming a metacognitive instructor, although all require intentionality in developing awareness of factors impacting student learning and using that awareness to self-regulate instructional efforts. It is a process with many options and possible strategies, where even small efforts can lead to big pay-offs in student learning and development.

———–

Nuhfer, E. (January 2015). Self-assessment and the Affective Quality of Metacognition: Part 2 of 2. Blog post on Improve with Metacognition, retrieved from https://www.improvewithmetacognition.com/self-assessment-and-the-affective-quality-of-metacognition-part-2-of-2/

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Developmental Framework for Teaching Expertise

A group of faculty at the University of Calgary share a framework for growth of teaching expertise that demonstrates that “teaching expertise involves multiple facets, habits of mind (or ways of knowing and being), and possible developmental activities.” They share this framework with the hope that others will share, adapt and use it in their own local contexts. The full paper is also available. Note that they also refer to it as a “framework for self-reflection” for faculty, which means it can be used to support metacognitive instruction.

 

Developing a Learning Culture: A Framework for the Growth of Teaching Expertise

 


It shouldn’t be Top Secret – Bloom’s Taxonomy

By Lauren Scharff, Ph.D.,  U. S. Air Force Academy *

Across the past year or so I have been reminded several times of the following fact: Most students are not aware of Bloom’s Taxonomy, and even if they are aware, they have no clue how or why their awareness of it might benefit them and their learning. Most instructors have heard of at least one version of Bloom’s Taxonomy, and some keep it in mind when designing learning activities and assessments. But rarely do instructors even mention it to their students.

Why don’t instructors share Bloom’s Taxonomy with their students? Is it a top secret, for instructors only? No! In fact, awareness and use of Bloom’s taxonomy can support metacognitive learning, so students should be let in on the “secret.”

What were the key experiences that led me to this strong stance? Let me share….

In May of 2016, I was fortunate to attend a keynote by Dr. Saundra McGuire at High Point University. In her keynote address and in her book, Teach Students How to Learn (2015), McGuire shared stories of interactions with students as they became aware of Bloom’s Taxonomy and applied it to their learning. She also shared data showing how this, coupled with a variety of other metacognitive strategies, led to large increases in student academic success. Her work served as the first “ah ha” moment for me, and I realized that I needed to start more explicitly discussing Bloom’s Taxonomy with my students.

An additional way to highlight Bloom’s Taxonomy and support student metacognitive learning was shared this past October (2017) when Dr. Karl Wirth led a workshop as part of our 9th Annual Scholarship of Teaching and Learning (SoTL) Forum at the U. S. Air Force Academy. In his workshop he shared examples of knowledge surveys along with data supporting their use as a powerful learning tool. Knowledge surveys are collections of questions that support student self-assessment of their knowledge, understanding, and skills. When answering the questions, students rate themselves on their ability to answer the question (similar to a confidence rating) rather than fully answering the question. Research shows that most students are able to accurately self-assess (confidence ratings correlate strongly with actual performance; Nuhfer, Fleisher, Cogan, Wirth, & Gaze, 2017). However, most students do not take the time to carefully self-assess their knowledge and abilities without formal guidance and encouragement to do so. In order to be effective, knowledge surveys need to ask targeted, granular questions rather than global questions. Importantly, knowledge survey questions can span the full range of Bloom’s Taxonomy, and Dr. Wirth incorporates best practices by taking the time to explain Bloom’s Taxonomy to his students and explicitly share how his knowledge survey questions target different levels.

Sharing Bloom’s Taxonomy in our classes is a great first step, but ultimately, we hope that students use the taxonomy on their own, applying it to assignments across all their courses. However, just telling them about the taxonomy or explaining how aspects of our course tap into different levels of the taxonomy may not be enough to support their use of the taxonomy beyond our classrooms. In response to this need, and as part of an ongoing Scholarship of Teaching and Learning (SoTL) project at my institution, one of my student co-investigators (Leslie Perez, graduated May 2017) created a workshop handout that walks students through a series of questions that help them apply Bloom’s as a guide for their learning and academic efforts. This handout was also printed in a larger, poster format and is now displayed in the student dorms and the library. Students use the handout by starting in the middle and asking themselves questions about their assignments. Based on their answers, they walk through a path that helps them determine what level of Bloom’s Taxonomy they likely need to target for that assignment. It should help them become more explicitly aware of the learning expectations for their various assignments and support their informed selection of learning strategies, i.e. help them engage in metacognitive learning.

Figure 1. Snapshot of the handout we use to guide students in applying Bloom’s Taxonomy to their learning.

As someone who is a strong proponent of metacognitive learning, I have become increasingly convinced that instructors should more often and more explicitly share this taxonomy, and perhaps even more importantly, share how it can be applied by students to raise their awareness of learning expectations for different assignments and guide their choice of learning strategies. I hope this post motivates instructors to share Bloom’s Taxonomy (and other science of learning information) with their students. Feel welcome to use the handout we created.

————

McGuire, S. (2015). Teach Students How to Learn. Stylus Publishing, LLC, Sterling, VA.

Nuhfer, E., Fleisher, S., Cogan, C., Wirth, K., & Gaze, E. (2017). How random noise and a graphical convention subverted behavioral scientists’ explanations of self-assessment data: Numeracy underlies better alternatives. Numeracy, 10(1), Article 4. DOI: http://dx.doi.org/10.5038/1936-4660.10.1.4

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Mind Mapping: A Technique for Metacognition

by Charlie Sweet, Hal Blythe, Rusty Carpenter, Eastern Kentucky University

Background

The Provost at Eastern Kentucky University invited Saundra McGuire to speak on metacognition as part of our University’s Provost’s Professional Development Speaker Series. Our unit was tasked with designing related programming both before and after McGuire’s visit. Our aim was to provide a series of effective workshops that prepared the ground for our university’s Quality Enhancement Plan 2.0 on metacognition as a cross-disciplinary tool for cultivating reading skills. The following mind mapping exercise, from one of four workshops, was taught to over 50 faculty from across campus and the academic ranks. Feedback rated it highly and suggested it is appropriate for any level of any discipline with any class size.

Scientific Rationale

The Mind Map, a term invented by Tony Buzan in The Mind Map Book (1993), “is a powerful graphic technique which provides a universal key to unlocking the potential of the brain” (9). For that reason, Buzan’s subtitle is How to Use Radiant Thinking to Maximize Your Brain’s Untapped Potential. A mind map provides a way for organizing ideas either as they emerge or after the fact. Perhaps the mind map’s greatest strength lies in its appeal to the visual sense.

We chose to share mind mapping with our faculty because according to Brain Rules (2008), rule number ten is “Vision trumps all other senses” (221). For proof, the author, John Medina, cites a key fact: “If information is presented orally, people remember about 10%, tested 72 hours after exposure. That figure goes up to 65% if you add a picture” (234). Because of its visual nature, mind mapping provides a valuable metacognitive tool.

How Mind Mapping Supports Metacognition

Silver (2013) focuses on reflection in general and in particular “the moment of meta in metacognition—that is the moment of standing above or apart from oneself, so to speak, in order to turn one’s attention back upon one’s own mental work” (1). Mind mapping allows thinkers a visual-verbal way to delineate that moment of reflection and in capturing that moment to preserve its structure. Because analysis is one of Bloom’s higher-order learning skills, mind mapping leads to deep thinking, which makes self-regulation easier.

Method

Essentially, a mind map begins with what Gerry Nosich in Learning to Think Things Through (2009) calls a fundamental and powerful concept, “one that can be used to explain or think out a huge body of questions, problems, information, and situations” (105). To create a mind map, place the fundamental and powerful concept (FPC) you wish to explore in the center of a piece of paper and circle it. If at all possible, do something with color or the actual lettering in order to make the FPC even more visual. For instance, if you were to map the major strategies involved in metacognition, metacognition is the FPC, and you might choose to write it as such:

M E T A
Cognition

Increasing the visual effect of the FPC are lines that run to additional circled concepts that support the FPC. These Sputnik-like appendages are what Buzan calls basic ordering ideas, “key concepts within which a host of other concepts can be organized” (p. 84). For example, if you were working with our metacognition example, your lines might radiate out to a host of also-circled metacognitive strategies, such as retrieving, reflection, exam wrappers, growth mindset, and the EIAG process of Event selection-Identification of what happened-Analysis-Generalization of how the present forms future practice (for a fuller explanation see our It Works for Me, Metacognitively, pp. 33-34). And if you wanted to go one step further, you might radiate lines from, for instance, retrieving, to actual retrieving strategies (e.g., flashcards, interleaving, self-quizzing).

Uses for Mind Maps

Mind mapping has many uses for both students and faculty:

  • Notetaking: mind mapping provides an alternative form of notetaking whether for students or professors participating in committee meetings. It can be done before a class session by the professor, during the session by the student, or afterwards as a way of checking whether the fundamental and powerful concept(s) was taught or understood.
  • Studying: instead of rereading notes taken, a method destined for failure, try reorganizing them into a mind map or two. Mind mapping not only offers the visual alternative here, but provides retrieval practice, another metacognitive technique.
  • Assessing: instead of giving a traditional quiz at the start of class or five-minute paper at the end, ask students to produce a mind map of concept X covered in class. This alternative experiment will demonstrate to students a different approach and place another tool in their metacognitive toolbox.
  • Prioritizing: when items are placed in a mind map, something has to occupy center stage. Lesser items are contained in the radii.

Outcomes

Mind maps are easy, deceptively simple, fun, and produce a deep learning experience. Don’t believe it? Stop reading now, take out a piece of paper, and mind map what you just read. We’re willing to bet that if you do so, the result will provide a reflection moment.

References

Buzan, T. (1993). The mind map book: How to use radiant thinking to maximize your brain’s untapped potential. New York: Plume Penguin.

McGuire, S. Y., & McGuire, S. (2015). Teach students how to learn: Strategies you can incorporate into any course to improve student metacognition, study skills, and motivation. Sterling, VA: Stylus.

Medina, J. (2008). Brain rules. Seattle: Pear Press.

Nosich, J. (2009). Learning to think things through. Upper Saddle River, NJ: Pearson.

Silver, N. (2013). Reflective pedagogies and the metacognitive turn in college teaching. In M. Kaplan, N. Silver, D. Lavaque-Manty, & D. Meizlish (Eds.), Using reflection and metacognition to improve student learning (pp. 1-17). Sterling, VA: Stylus.

Sweet, C., Blythe, H., & Carpenter, R. (2016). It Works for Me, Metacognitively. Stillwater, OK: New Forums.

Appendix: How to Use Word to Create a Mind Map

  1. Click Insert.
  2. Click Shapes and select Circle.
  3. Click on desired position, and the circle will appear.
  4. Click on Draw Textbox.
  5. Type desired words in textbox (you may have to enlarge the textbox to accommodate words).
  6. Drag textbox into center of circle.
  7. Repeat as desired.
  8. To connect circles, click Insert, then Shapes, and select Line.
  9. Drag Line between circles.
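
For readers who prefer a scriptable alternative to the Word steps above, the same radial structure can be sketched programmatically. This is a hypothetical illustration, not part of the original workshop: a nested dictionary holds the FPC at the center with basic ordering ideas radiating out, and a short function renders the map as indented text.

```python
# Hypothetical sketch: a mind map as a nested dictionary, with the
# fundamental and powerful concept (FPC) at the root and basic ordering
# ideas as branches (using the metacognition example from the Method section).
mind_map = {
    "METACOGNITION": {  # the FPC sits at the center
        "retrieving": {"flashcards": {}, "interleaving": {}, "self-quizzing": {}},
        "reflection": {},
        "exam wrappers": {},
        "growth mindset": {},
        "EIAG process": {},
    }
}

def render(node, depth=0):
    """Return the map as a list of indented lines, one concept per line."""
    lines = []
    for concept, branches in node.items():
        lines.append("  " * depth + ("* " if depth else "") + concept)
        lines.extend(render(branches, depth + 1))
    return lines

print("\n".join(render(mind_map)))
```

The indented-text form loses the radial, visual quality that Buzan emphasizes, but it can be handy for quickly capturing a map's structure before drawing it.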

A ‘New Ear’ for Student Writers: Building Awareness of Audience

by Michael Young, Robert Morris University


Motivation and Background:

A fundamental hurdle for most inexperienced writers is gaining a sense of their audience, and how a different consciousness may interpret the words, the organization, and the presentation that they (the writers) use to share ideas. It is different from knowing rules, techniques, or traditions of writing. It requires more than knowledge of the topic about which they are writing. Writers must be aware of their own individual thinking, their own choices, their motivations, and how these could be interpreted or misinterpreted by other people’s ways of thinking. This need for awareness of their own thoughts that could then support their writing efforts, i.e. metacognitive writing, led me to develop a new pedagogical process for the writing classroom that uses active presentations by others to convey audience interpretation.

I used this process for three years in creative writing courses, partially because students were already pursuing genres that often are interpreted orally, but I believe it could be applicable to any writing course, especially with the following course characteristics: 1) upper division/at least sophomore level, so the students are already somewhat experienced collegiate writers, and 2) class size is small, ideally 20 or fewer students. No special materials, other than imagination and the means to convey ideas, are needed for the in-class exercises.

Nuts and Bolts:

This pedagogical process has several steps. To first prepare the students and get them thinking about how an audience might interpret their work, the students are given an initial survey on their then-current process of writing and concept of their potential audience. Consistently, three out of five students agreed that they had a “mental picture” of their reader, but it was often no further developed than their college peers or even themselves. Most could not describe their readers any further, and some said they had not considered a concept of a readership. Perhaps they had only ever written with the teacher, and thus a grade, in mind.

The second step involves having canonical examples of their genre, fiction or poetry, interpreted by others. During this step those others give a presentation / reading of the work in a manner that conveys their interpretation of the writing. Those others can be classmates or a more external audience. For example, the first two years I used this process, the others were members of the Forensics Team from the University of Nebraska-Lincoln, then led by Professor Ann Burnett.

A third step, which has evolved over the years, was to have others present the students’ own writing back to them. This third step was implemented as a cycle. The students wrote their piece (either individually or as a group) and then gave it to others (classmates or external individuals) for interpretation with no additional input from the writers. The presenters would convey their interpretation, which then could be used by the writers to guide their revisions based on a better understanding of possible audience interpretation. If revisions were made, then the cycle of interpretation could be repeated.

Outcomes:

When this was done at the University of Nebraska-Lincoln, in a project funded by a grant from the university’s Teaching Council, 80% of the collaborative groups elected to revise their texts after hearing them interpreted. They noted the experience of hearing their stories being told by someone else, someone who was sharing their own understandings and insights into the words, heightened an awareness of qualities like the “flow and rhythm” of words or of “trying to make a picture in my head”, and an overall greater attention to what their drafts were able to communicate. For example, the potential hollowness of easy clichés that had not occurred to the writers, or the absence of descriptions they had had in mind but had not articulated, now became more evident. Further, the majority of the class reported being much more aware of their own thinking (an aspect of metacognition) and the thinking of others.

By hearing, and sometimes seeing by the use of movements, how another person re-created the writer’s intentions, each writer had the opportunity to perceive how their audience understood what had been written down – in a way, to hear their own thinking – and to question themselves. Is that what they had wanted someone else to feel, to think, or had their expression fallen short of their conception? In other words, the process allowed them to “hear it (their work) with a ‘new ear’” and some of them realized they “should have found another way to get that (sic) message across.” That “new ear”, hopefully, was their more careful listening to and questioning of their own thoughts, i.e. being metacognitive about their own writing.


Participatory Pedagogy: Inviting Student Metacognition

by Nicola Simmons, Brock University, nsimmons@brocku.ca

Background

I teach higher and adult education, including adult developmental psychology, and like to invite my students to be aware of their cognitive processes. I see this as central to being an adult learner. One strategy I have developed is engaging students in creating course outcomes and content. I hope to help students become more aware of, more involved in, and better assessors of their own learning; in short, to examine their learning through a metacognitive lens.

This example is from a Masters of Education class, Exploring Approaches to Professional Development. The class is typically quite small (up to 20 students), but I have used the approach with groups of 50 students at the undergraduate level as well.

The Approach 

The course follows Siemens’ (2008) participatory pedagogy (see syllabus excerpt), inviting students to co-construct the course process, including choosing course readings and creating grading rubrics.

As Biggs and Tang (2011) note, student course co-ownership helps engage students in deep learning; it also builds their awareness of their learning processes. The first assignment, for example, asked them to:

Articulate your intended learning during this course, including a focus for personal and professional development. What will your development focus be? What will you do to realize your plan?

This engages students metacognitively as they take responsibility for their learning path and prepares them for the final assignment, a reflective ‘portfolio,’ in which they synthesize their learning over the term:

Create a creative and critical summary of your changing perspectives and reflections throughout the course, integrating readings (both assigned and others). Discuss your key learning, referring to course and outside experiences. Exemplary projects demonstrate critical analysis, synthesis, and self-evaluation. Can be any format (paper, song, performance, art; format negotiable). Addresses:

  • What theories help you?
  • What have you learned?
  • How can you use that?
  • How have you changed?
  • How do you know?

Each of these prompts invites consideration of the learning and development process and supports students in acquiring habits of mind that will allow them to approach future courses with a metacognitive lens. This has also led to their growth as scholars: One year, many of the students engaged in a self-study that included conducting a literature review and creating questions to guide our reflections. The result of that work was several conference presentations and a peer-reviewed paper (Simmons, Barnard, & Fennema, 2011) that outlined the transformative learning resulting from the student co-constructed course.

What was fascinating to me was the way the course process built students’ metacognition not only about their learning, but also about their teaching. One wrote:

I told my colleagues the story of this course and they were moved to consider new ways of doing culminating projects. Why isn’t there more choice? Why do we tell students what they must produce to demonstrate their own learning? Why don’t we add the additional layer of asking students to find the best way to demonstrate their learning?

Outcomes and Lessons Learned

Developing metacognition is not a pain-free process! One student described being transformed during the process from fear to increased confidence:

Activities were out of my comfort zone and there were times that I struggled with the unknown … I was able to see the value once I moved beyond the frozen fear of uncertainty to ask myself “What did I want to gain from this course? How did I learn when pushed out of my comfort zone?” I had to be transformed into a student who was open to this new concept and new territory for learning…[where] mistakes … would not be judged but instead used as stepping stones toward learning.

Instructors should be mindful of the importance of support throughout the process. Just as the students are invited to be metacognitive about their processes, it helps if the instructor is transparently metacognitive about the overall course path. For me, that meant saying things such as “this may be new for you, but I’d like you to consider trying it” and reassuring students that discomfort was a sign they were onto something good!

The course format continues to unsettle students but also transform them into metacognitive learners, and I finish with one student’s illustrative words:

I remember thinking at the time that the final project was the most difficult task that I had encountered … I really had to ponder … how my journey through the course could be effectively captured and conveyed … It continues to personify my journey through work/life, the choices we make when we meet resistance or the paths we take … how we travel the road is for our choosing.

References

Biggs, J. B., & Tang, C. (2011). Teaching for quality learning at the university: What the student does. Maidenhead, UK: Society for Research into Higher Education & Open University Press.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906-911.

Siemens, G. (2008). New structures and spaces of learning: The systemic impact of connective knowledge, connectivism, and networked learning. Paper presented at Universidade do Minho, Encontro sobre Web 2.0, Braga, Portugal, October 10. Available online at http://elearnspace.org/Articles/systemic_impact.htm

Simmons, N., Barnard, M., & Fennema, W. (2011). Participatory pedagogy: A compass for transformative learning? Collected Essays on Learning and Teaching, 4.


Make It Stick in Cognitive Psychology

by Jennifer A. McCabe, Goucher College, jennifer.mccabe@goucher.edu


Motivation and Background: I am a cognitive psychologist with a research program focused on metacognition and applied memory in education. Three years ago I decided to structure my Cognitive Psychology course around the principles described in the book Make It Stick: The Science of Successful Learning by Brown, Roediger, and McDaniel (2014). The book discusses many memory-improvement principles, including retrieval practice (retrieving new learning from memory), spacing out retrieval practice, interleaving the study of different problem types, elaboration, and reflection. Other topics include the fluency illusion, getting past learning styles, and developing a growth mindset. Adopting this book as required reading, and structuring the course to reflect these principles, dovetailed with my increasing commitment to prompt and support students’ metacognitive growth. I hoped that this would both enhance student learning on objective tests (in a notoriously challenging course) and explicitly support a course learning outcome: Improve your metacognitive skills (knowing what you know, learning how to learn).

Context in which the activity or process has been used: This has been included in three sections of Cognitive Psychology, a 200-level course offered at Goucher College, a small liberal arts institution in Baltimore, Maryland. The class size is 25-30 students, and I have been teaching this course for 13 years.

Description of activity or process methods: The description of the activity is in my Cognitive Psychology syllabus (available through Project Syllabus: http://teachpsych.org/Resources/Documents/otrp/syllabi/JM16cognitive.pdf). On the first day of class, I describe the “Make It Stick” Reflection Papers. For each class period in which a chapter is assigned, students prepare and bring to class a 1-page, single-spaced reflection. Content and style are open, but students must demonstrate deep and careful thinking about the topic, with explicit connections to life experiences, habits and plans/intentions, and course material. They can also include questions and/or other personal reactions to the chapter. I note that this assignment requires elaboration and reflection, two effective learning strategies discussed in the book. Students submit 8 reflection papers during the semester (one per chapter), each worth up to 5 points. Out of a 500-point class, this assignment is worth up to 40 points (8%).

The first reflection paper is due early in the semester, typically the second week, then the subsequent seven chapters/papers are due approximately once per week. We take time in class on those days to engage in small- and large-group discussion. Most of these discussions are framed in terms of metacognition, particularly in light of research suggesting that college students do not always understand how learning works, and cannot always predict which memory strategies lead to the best retention (e.g., McCabe, 2011). I encourage them to consider their lives as learners, and how they can use information from the book to adjust their strategies.

We also talk about how this course is structured to reflect “best practice” learning strategies. For example, students take a self-graded “retrieval practice” quiz at the start of most class periods, because research shows that frequent, effortful, low-stakes, cumulative, spaced (distributed) retrieval practice (1) produces the most durable learning and (2) improves the metacognitive accuracy of judgments about what you know. I strive to be transparent about the purpose of all course elements. In a sense, then, I see Make It Stick as a framework for the entire course – core content and topics for discussion, rationale for course design, and hopefully motivation for students to engage and feel empowered in their own learning.

Outcomes and Lessons Learned:

Since implementing this assignment, I believe that students’ knowledge about effective learning strategies has improved. They seem to enjoy the book as a required course component – on an anonymous questionnaire, 88% agreed that Make It Stick should be included in future classes. When asked whether this course had supported the learning outcome of improving metacognitive skills, 100% agreed or strongly agreed (71% strongly agreed). And when asked about one way this course has changed the way they think or behave in the world, 78% included a statement relating to metacognition. Some examples include:

“I now analyze the way I am absorbing and encoding information. I have never thought about the way I learn but now I am so grateful to accept the study strategies that work and throw away the ones that don’t.”

“It has helped me to develop a better understanding of effective study/learning strategies. Improved my metacognitive skills!”

“When I study and am overconfident in my skills, I think about metacognitive skills and test myself. This class helped me study better.”

Of course, the major challenge with teaching students metacognition is that acquiring knowledge about how learning works is only half the battle. I still struggle with motivating students to actually implement these strategies. Many of them are desirable difficulties (Bjork, 1994): they feel effortful and error-prone (even frustrating) in the short term and show benefits only later, precisely because of that initial challenge. I encourage students to use the strategies regularly, so that they become habits of mind, but I’m not convinced they consistently do so after one semester of exposure to this material. Yet the fact that they make statements such as the ones above gives me hope that they are integrating the Make It Stick ideas about metacognition into their lives.

Though this assignment has been part of a highly relevant course, Cognitive Psychology, the book Make It Stick (or selected chapters) could enhance a number of courses in and outside of psychology – as well as first-year seminars and similar courses that focus on student skill development, with the goal of teaching them how to be better learners.

References

Bjork, R. A. (1994). Memory and metamemory considerations in the training of human beings. In J. Metcalfe & A. Shimamura (Eds.), Metacognition: Knowing about knowing (pp. 185–205). Cambridge, MA: MIT Press.

Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make it stick: The science of successful learning. Cambridge, MA: The Belknap Press of Harvard University.

McCabe, J. (2011). Metacognitive awareness of learning strategies in undergraduates. Memory & Cognition, 39, 462–476. doi:10.3758/s13421-010-0035-2


Teaching Transformation Through Becoming a Student of Learning

by Patrick Cunningham, Rose-Hulman Institute of Technology,
Holly Matusovich & Sarah Williams, Virginia Tech


Motivations and context:

I teach a variety of Mechanical Engineering courses at a small private undergraduate institution with approximately 2000 students. The courses I teach focus on the application of scientific theory and math to solve engineering problems. Since I started teaching I have been interested in how to help students learn more deeply in my courses. This interest eventually led me to a sabbatical in the Department of Engineering Education at Virginia Tech, where I established a research partnership with Dr. Holly Matusovich, and later Ms. Sarah Williams, studying student metacognitive development. We have been interested in how to help students become more sophisticated, lifelong learners and how to aid instructors in supporting this development. This collaboration initiated a research-to-practice cycle, in which my interest in enhancing student learning led to research on student metacognitive development, and research results have in turn influenced my teaching practice.

Description of the process:

The research-to-practice cycle has transformed my teaching by helping me become a student of learning. For me the process has involved formal educational research, but it does not have to. My implementation of the cycle follows:

  1. Identify what teaching and learning issue you care about and develop partnerships.
  2. Plan the study.
  3. Implement the study and analyze the data.
  4. Interpret the results and use them to direct modifications to your teaching.
  5. Repeat steps 1-4.

I am interested in enhancing student learning and that led to collaborative metacognition research with Dr. Matusovich. Other possible partnerships may be with colleagues, your teaching and learning center, disciplinary education researchers (e.g., engineering or physics education), or even education researchers at your own institution (e.g., educational or cognitive psychology).

We planned the research through the preparation of a successfully funded NSF grant proposal. The process included establishing research questions, specifying study phases, determining what data to collect and how, and planning for data analysis. Even if you are not engaging in formal research, the quality and success of your study will depend on a well laid out plan. As a mechanical engineering professor, I found my collaborators to be indispensable partners in this planning.

Early in our research, we gathered baseline data through student interviews on how students approach learning in engineering science courses and how they define learning. We have found that students predominantly rely on working and reviewing example problems as a means of learning. This approach to learning falls into the category of rehearsal strategies, where students are seeking to memorize steps and match patterns rather than develop a richer conceptual understanding. While it is important to know facts, results from learning science show rehearsal strategies are insufficient for developing adequate conceptual frameworks that are necessary for transferring concepts to new situations and being able to explain their understanding effectively to others – key aspects of engineering work. To construct such rich conceptual frameworks students also need to engage in elaborative and organizational learning strategies, but students reported underutilization of these strategies. Students’ overreliance on example problems does not align with being able to apply course concepts to real-world problems.

In reviewing the data, I also realized that I might be part of the problem. My teaching and assessments had been primarily organized around working problems with little variation. The research helped me change. I decided to scaffold students’ use of a broader range of monitoring, elaborative, and organizational strategies by changing my approach to teaching. I realized that I could empower my students by helping them learn about and refine their learning skills – even as I teach the content of the course.

I made significant changes to my course. I changed the grade category for “homework” to “development activities,” which includes the regular homework as well as new homework learning-check quizzes and video quizzes. These quizzes provide low-stakes opportunities for formative feedback to students about their conceptual understanding. I also changed my classroom activities, engaging students in evaluating and explaining given solutions containing errors, practicing recall, interrogating examples with “what if” questions and answering them, and creating problems for specific concepts. For the next project steps, we are collecting data on these implementations so the research-to-practice cycle can begin again.

Outcomes:

My students performed at least as well on traditional problem-solving exams as students in other sections of the same course. Importantly, they reported feeling more responsible for their learning and exerting more effort in their learning than in other engineering science courses. For me, this has been a more fulfilling teaching experience. Not only have I found that students asked better questions about course content, but I also had more conversations with students about how they can learn more effectively and efficiently. It has added rigor and a clarity of purpose to my teaching that reaches beyond course content.

Lessons learned:

I learned to articulate the differences between my course and other courses and to get buy-in from students for what I was trying to do. Student resistance to change can be hard on a teacher, but it is worth working through to improve teaching and learning experiences. Collaborative partnerships help!

Acknowledgement:

The metacognition research was supported by the National Science Foundation under Grant Nos. 1433757, 1433645, & 1150384. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.


Utilizing a Metacognition Chart for Exam Review and Metacognitive Skill Development

by Dana Melone, Cedar Rapids Kennedy High School


Motivation and Context:

I teach AP Psychology at Kennedy High School in Cedar Rapids, Iowa. My students range in age from 15 to 18 years old. They also come into my classroom with a variety of grade point averages, ranging from below 2.0 to above 4.0. While some students have excellent note-taking and study skills as well as an understanding of what they need to study, I find that most of my students (even the top ones) tend to try to study everything and get overwhelmed. They also do not use review time to their advantage.

At the same time, my students love review games and in-class review time. However, for years I was hesitant to use them because students would be so actively engaged in the game or review activity that they would not take the time to consider what they knew, what they did not know, and how this should affect their studying (i.e., practice metacognition about their learning). I wanted to engage them in demonstrations and games, but I also wanted them to use those activities to guide their studying and to develop effective learning strategies that could be used beyond my course. In response to this dilemma, I developed the metacognitive prompt chart below.

Nuts and Bolts:

In order to help students gauge how much they know, I have started requiring them to complete the metacognitive chart as they are reviewing in class or playing a review game. I have also pointed out that they can use the chart even when they are working on current content. The chart consists of four columns that help students categorize their understanding of the concepts.

Students use the chart by putting each concept name in the column that best describes their understanding of that concept as we move through review activities or games. There are also two questions at the bottom that ask them about the focus of their studying and patterns they have seen over time. In the end, they have a chart that lets them see at a glance what they need to study and what they already know.

  1. What concepts need to be the focus of your studying? How will you make sure you are studying them actively?
  2. Look at your past charts, what concepts seem to remain a problem? How will you address this?

My students have this chart out any time we are going over previously learned content or reviewing (multiple times a week). I encourage my students to re-examine their charts once a week to look for patterns over time and reflect on what they need to get help with or review. I also encourage them to combine charts as we near cumulative assessments. Multiple times a month I collect the sheets and can see at a glance the areas in which all my students are struggling. I have also been able to use the charts for my own metacognition in planning review, re-teaching, and remediation.

Outcomes:

This chart has proven its effectiveness on many levels. Students have been able to see clearly the areas in which they need improvement and should focus. They are also able to examine these sheets over time to see patterns in the content they are struggling with and doing well with. An unintended outcome has been my ability, as the teacher, to use the charts for ongoing formative assessment of my classes.


Metacognitive Time Capsule Assignments for Reflection on Writing Skills

by Sarah Robinson*, U. S. Air Force Academy
sarah.robinson@usafa.edu


Motivations and Context:

I teach upper level Geoscience courses on Remote Sensing and Imagery Analysis—essentially using satellite imagery to study earth surface materials and processes. In addition to the course objectives on imagery analysis, I also have a course objective on communication. Specifically, I want my students to be able to construct a convincing, clear, and concise written argument that communicates their analysis choices and subsequent results. Using imagery to analyze a geospatial problem is not enough; students also need to be able to write a convincing technical summary that communicates their analysis and results to others.

One of the challenges with communication course goals is that writing is often approached with a fixed mindset (Dweck, 2007), meaning it is viewed as some innate quality that you either have or you don’t. With a fixed mindset, it simply doesn’t make sense to expend effort on writing (e.g., writing drafts or reading feedback) because there is no clear path for improvement—it is a fixed skill regardless of effort. If students instead view writing with a growth mindset, they see it as a skill that can be improved through practice and through specific actions/strategies for making progress. Engaging a growth mindset requires reflection on abilities and progress (self-awareness) and identification of strategies for improvement (self-regulation). In terms of writing, this translates into effort expended on practice in multiple assignments/drafts, reflection on progress and feedback, and identification of strategies to improve future writing assignments. Course design and assignments that promote metacognition through self-awareness and self-regulation can help students develop this growth mindset. Specifically, I incorporate systematic practice, actionable feedback, and a time capsule reflection assignment in my course design.

Nuts and Bolts:

Systematic Practice:

Students have 3 lab assignments and a final project where they are asked to analyze geospatial problems using imagery and then summarize their analysis and results in 1-2 paragraphs. Each lab assignment has different geospatial problems, but the writing expectations and format are the same—for each geospatial problem, students write a summary that includes an introduction to the research problem, an explanation of their analysis choices in solving the problem, and an evaluation of their results. By having the same format (but different topic) for each writing assignment, students get systematic practice in writing a convincing, clear, and concise written argument.

Actionable Feedback:

The consistent format and expectations across writing assignments allows me to use the same rubric for every assignment. While the content changes with each assignment, students can reflect on their progress by looking at their rubric scores across the semester. For the first assignment, the rubric is the same, but a multiplier is applied to the score to compensate for their initial lack of familiarity with the format. In addition to rubric scores, I provide comments in the text (students submit electronic copies of their assignments) that provide actionable feedback on how to improve the next submission. Because the comments are relevant to a future assignment, students report that they engage in self-regulation by reading and using the feedback to improve their next assignment.

Metacognitive time capsule assignment

To support student self-awareness of their progress over the semester, I created a time capsule assignment where students compare their writing on the first lab with their writing on the final project. This assignment supports student metacognitive development because it asks students to develop self-awareness by reflecting on the changes they see in their writing. As all of their submissions are digital, students have copies of all their assignments and feedback across the semester. This assignment asks students specific questions to guide their reflection and asks them to provide example text from their assignments to support their statements. I am very clear in class that they receive full credit for participating in the assignment—they are not graded on what is in their answers, only on whether they provided complete answers.

There are two keys to this assignment for effective student reflection: the “time capsule” aspect and the consistent assignment format. Having students preserve and read their actual first writing assignment is critical—this first assignment essentially captures who they were at the beginning of the semester and preserves it, as in a time capsule, to be revealed intact at the end of the semester. The time capsule aspect allows for unfiltered, direct comparison by students of their skills then vs. their skills now that is not overwritten by their experiences during the semester.

The other key component is having a consistent assignment format to make comparison easier. This assignment would not have worked as well if students were comparing writing assignments that had very different formats or expectations. By keeping the format/expectations consistent, students are better able to see and explain their progress.

Outcomes:

I had trepidations about giving this time capsule assignment the first time I used it—I honestly didn’t know how students would respond. I was pleasantly surprised to see how engaged they were—instead of just writing their answers during class time, they were sharing with each other their comparisons between their first paragraphs and what they were then able to write for their final project. Their written answers documented their reflection on the changes they saw in their technical writing skills (self-awareness) and identified writing habits that they could continue/change in future classes (self-regulation).

Lessons Learned and future directions:

This type of time capsule assignment is something that I will continue to build into my courses. The planning required to design a consistent format and preserve early assignments is a small cost for the benefits of having students develop self-awareness and self-regulation and supporting a growth mindset.

Reference

Dweck, C. S. (2007). Mindset: The new psychology of success. New York: Ballantine Books.

* Disclaimer: The views expressed in this document are those of the authors and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Practicing Metacognitive Instruction Informs Continuous Class Improvement While Reinforcing Student Self-Awareness of Learning

by Lara Watkins, Bridgewater State University, lwatkins@bridgew.edu


Motivations and background:

I have been teaching a course titled “Anthropology of race, class and gender” for about five years at a state university. Because the course satisfies requirements for the core curriculum as well as for the anthropology major, the student population is diverse, ranging from first-year through final-semester students and including both majors and non-majors. The course is taught in person with about 25 students per section.

I implemented a series of mid-course reflections for a variety of reasons. (1) I sought to encourage students to reflect upon their learning in the course as a way of helping them recognize and assess their own learning over time, while (2) simultaneously providing an indicator of the main messages being retained by students to help in planning future iterations of the course. The reflection served as a low-stakes evaluation of learning, which then fed into continuous course improvement. Pragmatically, I was interested in (3) whether there were specific barriers to student comprehension of the material that might make a substitute reading or focused classroom interventions appropriate. Since student metacognition about their learning can inform metacognitive instruction, I also sought to assess (4) the degree to which students saw value in a particular reading and (5) whether they could link it to other course materials and their own learning, thereby encouraging learning across multiple levels of Bloom’s Taxonomy (Krathwohl, 2002).

Nuts and Bolts / Procedure:

Each semester, I incorporate four in-class, mid-course reflections. Students complete each pen-and-paper reflection in about five minutes. They have the option of handing in the reflection anonymously or adding their name to the form. Each mid-course reflection is about 3-5 questions long. The first question always asks students to state a few key points that they have learned in that particular section of the course, while the last question always provides students a place to anonymously raise questions and concerns. The middle questions vary depending on my evolving concerns or interests.

The middle questions on the reflection example shared here focused on the use of a full-length book. In general, the middle questions focus on a specific aspect of that portion of the course (e.g., a reading or the use of an online learning tool) that is assigned to deepen learning but through which student experiences can reveal barriers to that goal. My goal in this particular example was to find out whether students were engaged in the reading, whether they were taking away the main ideas, and whether there were challenges that could be mitigated in future iterations of the course.

Outcomes and Lessons Learned:

  • The first question (which asks students to summarize what they have learned across a few weeks of the course) provides a useful snapshot of the main messages interpreted and retained by students. Through assessing students’ summaries in question one, I have found that students were not able to reiterate key points to the same degree across different portions of the course, thereby suggesting which section(s) of the course needed further elaboration and attention in later semesters.
  • For this particular reflection example, I found that the students’ perspectives of the book did not align with my anticipation of their perspective. (The students were more positive than I expected.) Checking in with students throughout the semester helps to give the instructor a tangible and direct indicator of student interpretations of the course and course materials. This can feed into continuous course improvement.
  • This course meets twice a week on Mondays and Wednesdays. A key lesson learned from the student feedback is that they need lengthier readings to be due on the Monday. While this might appear intuitive, instructors sometimes lose sight of student logistics when constructing their syllabi and the multitude of topics to be covered. It also highlights the need to build in multiple, explicit reminders for students to start lengthy readings in advance.
  • Instructors implementing a similar activity will want to consider the benefits and drawbacks of allowing anonymous submission. I chose to make the feedback anonymous by default so that students would feel comfortable sharing honest assessments and could let me know if they had not completed the reading without fearing it would affect their grade. Instructors who would like to track individual students’ progress over time may instead ask students to identify themselves. I have found it appropriate to individually email students who include their name and raise a specific question or concern. Students often express gratitude for the personal outreach: it directly addresses their question, thereby decreasing their perceived barriers to success, and it conveys respect and concern for their individual learning trajectory, thereby cultivating a supportive learning climate.

This reflective approach provides a series of quick, useful indicators of student learning that I can use as an instructor to adjust my teaching and better support my students’ learning. A second benefit is that the reflections center students’ attention on the metacognitive and higher-order processes of remembering, connecting, analyzing, and evaluating course concepts. Providing short assessments like this at a few time points across the semester is an easy way to “take the pulse” of a particular class and then use that feedback to identify teaching practices that are working well and those that might need to be tweaked. Metacognitive instruction leads to continuous course improvement and, ultimately, to better facilitation of student learning.

Reference

Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory into Practice, 41, 212-218. http://dx.doi.org/10.1207/s15430421tip4104_2


Promoting Metacognition with Retrieval Practice in Five Steps

by Blake Harvard, James Clemens High School


Motivation for Activity

I am very lucky to work at a high school with students who are quite focused and, from the standpoint of intelligence, very gifted. This does not make them great learners, though. Many of my students clearly benefit from being able to memorize information. That may work in high school, where assessments are sometimes given daily; in college, however, assessment of material may consist only of a midterm and a final. As a teacher who wants to better prepare my students for a lifetime of learning, I am motivated to introduce and cultivate learning strategies that foster this personal growth and a better understanding of their own learning through specific exercises promoting metacognition.

Context for Activity

I use this activity with my high school AP Psychology classes. These classes average about 30 students. Although my situation is quite specific, I believe this activity can easily be accommodated to fit most class sizes in almost all disciplines of study.

Description of Activity

Let me put all the cards out on the table: I am a big believer in using researched/proven learning strategies promoting metacognition to improve retention of classroom material. I have applied strategies in my high school Advanced Placement Psychology classes and seen notable improvements in three areas:

  • Test scores
  • Study habits
  • Students’ understanding of their learning

Improvement in test scores is important for many reasons and ultimately reflects an overall level of understanding. While I am thrilled to see my mean test score increase and standard deviation shrink a bit, that is not what I’m most excited about when lauding learning strategies. I am far happier with the students’ growth in their study habits and metacognition about their learning. While I instruct highly intelligent adolescents, most of my students do not enter my room as great learners; they are merely great memorizers. There’s nothing inherently wrong with that, but it becomes much more difficult to simply memorize your way through college, and most of my students (80% to 90%) will attend university.

In particular, the one learning strategy I believe to be the most effective is retrieval practice. The Learning Scientists provide a great overview of the strategy. Basically, the idea is to attempt to retrieve information from memory a bit after it has been presented to you. This can be done minutes, hours, or days later, and can take many forms: multiple-choice or matching questions, essays, and more. I have written before on the topic of retrieval practice and its impact on my classroom. Here, I want to focus on how I promote metacognition through the use of retrieval practice in my classroom.

Usually the day after a lesson, I use these steps to practice retrieval of the information:

  1. Provide questions or a prompt.  Since I am preparing my students for an AP exam in May, I usually provide AP-style questions (no more than 7). By ‘AP style,’ I mean either multiple-choice questions with five possible answers or an essay prompt requiring students to use terms or concepts from the previous lesson to relate their knowledge to a given scenario.
  2. Answer using only their brain.  This step starts to break their habit of asking those around them for help or looking at their notes or the internet for assistance.  In my opinion, this is the most important aspect of retrieval practice: students are forced to attempt to retrieve material as they practice answering test questions, which is exactly the process they will have to engage in during the actual test. A second benefit is that this practice can help reduce test anxiety.  A lot of students shy away from this step because it can be difficult or because it highlights flaws in their learning, but I tell my students it is far better to struggle with the material now than on the test.  If the test is the first time a student must retrieve the material from memory, we’ve all probably failed.
  3. Evaluate their answers.  How many answers are they very confident about?  How many are simply guesses?  I want students to understand that if they guessed and answered correctly, they still don’t know the answer; they just got lucky.  Sometimes I’ll have my students delineate, using a different color pen, the answers they are confident about and those they are not.  This helps them visualize their pre-grade understanding.
  4. Compare/contrast answers with neighbors.  I instruct the students to have a conversation and debate any discrepancies.  At this point, if they can thoughtfully discuss their answers, they probably have a decent grasp of the information and have taken time to reflect on their learning, specifically on where there may be holes in it or where they may have been misled by what they thought they knew.
  5. Grade their paper.  After students grade their paper, I want them thinking about the following questions, which allow them to practice their metacognition and regulate/reinforce their study habits for future practice.

a. Does my grade reflect my knowledge?

b. Am I happy with my grade?

c. If no to either of the above questions, what strategies can I utilize to successfully retain the material?  At this point, many students incorrectly believe that their understanding of the material is complete, for better or worse.  You can almost see them thinking either “Oh well, I just don’t know this” or “I scored well, I must know this.”  I attempt to impress upon my students that using other strategies, like spaced practice and dual coding, will further improve and solidify retention of the material.

d. If yes to the above questions, I ask students to reflect on what work they put in to remember the material so they can plan to use that strategy again for future learning. This step also helps reinforce that they should focus on learning strategies, not just guessing or luck.

Reflection

After many semesters of working with students, I have come to believe that metacognition and reflection on study habits and strategies are of foundational importance.  One of my goals for the students in my class is that, over the course of a semester, these learning strategies become their norm for studying.  They are not something extra; they are what students do to practice and learn.  Without the reflection piece of using retrieval practice and other learning strategies, it is hard for high school students to examine their growth in studying and practicing. While walking the students through these five steps may seem a little laborious, the explicitness of the instructions seems to work well to increase their awareness of their own learning and shape their behaviors toward more effective practices.

It is often quite difficult to convince teenagers that their study and practice habits, which usually rely on simple memorization, will more than likely not succeed in college.  They need to see results from their added efforts.  Using these five steps, I have witnessed students’ grades improve and their study and practice habits change for the better.  As a teacher, I’m not sure it gets any better than that: improving students’ learning and making them more successful.  It’s why we get paid the big bucks.  🙂

References

Learn How to Study Using…Retrieval Practice, The Learning Scientists, www.learningscientists.org

Retrieval Practice in the High School Classroom, The Effortful Educator, www.effortfuleducator.com

Learn How to Study Using…Spaced Practice, The Learning Scientists, www.learningscientists.org

Learn How to Study Using…Dual Coding, The Learning Scientists, www.learningscientists.org


A Project-Based Method to Help Students Practice Study Strategies in an Authentic Context

by Hillary Steiner, Kennesaw State University


Motivations and Context: Success in college requires the development of self-regulated learning strategies that move beyond high school skills, but teaching these strategies can be challenging. I teach a first-year seminar at a large comprehensive university that includes helping students develop college-level studying and time management skills among its goals. Knowing that students would be more likely to value these skills (and later, transfer these skills) if they were situated in context, I developed an assignment that requires students to practice self-regulated learning strategies—active reading, management of study time and achievement goals, proactive interaction with faculty, metacognitive reflection, and more—within the context of a student-selected course.

Assignment: In the Strategy Project assignment, students learn time management, communication, and study strategies in the process of preparing for an actual test. Students then demonstrate that learning by submitting their test preparation activities as part of a graded project in the first-year seminar.

First, students choose a test in another course that they find challenging. Then, they complete a contract, in consultation with their first-year seminar instructor, that indicates their individualized due dates and studying plans based on their chosen test. Students also write a pre-project reflection paper discussing their current approaches to studying and time management.

Next, the students complete a “professor interaction” activity where they visit the instructor of the chosen course to discuss a previous test or quiz, if applicable, and ask for advice about achieving success in that particular course. This portion of the project helps first-year students become comfortable interacting with their instructors and reinforces help-seeking behaviors. After this meeting, students develop a plan of study that outlines the strategies they will use to study for the test. This activity encourages effective time management and allows students to experience the benefit of study time that is distributed over several days.

Finally, the largest portion of the project requires students to complete a variety of metacognitive strategies such as textbook annotation, self-quizzing, concept-mapping, etc. Providing choices in strategies allows students to demonstrate metacognition by effectively matching studying techniques to their chosen test. After the test is graded and returned, students again complete a metacognitive reflection on the outcome of their studying habits in a short informal paper and presentation to the class.

Outcomes: For a number of years, I have studied the Strategy Project as a method for students to practice metacognition in an authentic, valuable context. I have used the project as a component in STEM learning communities that paired a first-year seminar with first-year STEM courses (e.g., Steiner, Dean, Foote, & Goldfine, 2016) as well as stand-alone first-year seminars (e.g., Steiner, 2016; 2017). Results from these studies have indicated that the project did raise awareness of, and encourage the use of, beneficial metacognitive strategies, and for most students, also increased their test scores in the chosen courses. One study’s preliminary findings (Steiner, 2017) also show a gain in self-reported metacognitive behaviors as measured by the Motivated Strategies for Learning Questionnaire (Pintrich, Smith, Garcia, & McKeachie, 1993). Anecdotally, students tell me that the Strategy Project was a powerful motivator to change high school habits that had become ineffective. Many students say that although they realized their strategies needed to change, without the incentive of a graded project, they would not have committed to changing their approaches. Students also have responded positively to learning more about metacognition in my first-year seminar (Steiner, 2014), suggesting that metacognition may be an important topic for others to address in similar seminars or “learning-to-learn” courses.

Lessons Learned and Future Directions: I continue to revise the Strategy Project yearly as I learn more from my students about its efficacy. To date, I mostly have used the Strategy Project in my own classroom. However, a colleague and I are planning a large-scale study of the Strategy Project which will compare the metacognitive gains made by students in sections of the first-year seminar that include the project versus those that do not. Because many faculty who teach the first-year seminar do not have a background in educational psychology, we will include professional development on metacognition and memory as part of the training. I look forward to continuing to revise the Strategy Project in light of others’ experiences using it. I would appreciate any feedback you or your students have on the effectiveness of this assignment in your own classroom.

References

Pintrich, P.R., Smith, D.A., Garcia, T., & McKeachie, W.J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53(3), 801-814.

Steiner, H.H. (2017, March). Using a strategy project to promote self-regulated learning. Paper presented at the SoTL Commons Conference, Savannah, GA.

Steiner, H.H. (2016). The strategy project: Promoting self-regulated learning through an authentic assignment. International Journal of Teaching and Learning in Higher Education, 28 (2), 271-282.

Steiner, H.H.; Dean, M. L.; Foote, S.M; & Goldfine, R.A. (2016). The targeted learning community: A comprehensive approach to promoting the success of first-year students in general chemistry. In L.C. Schmidt & J. Graziano (Eds.), Building synergy for high-impact educational initiatives: First-year seminars and learning communities. Columbia, SC: National Resource Center.

Steiner, H.H. (2014). Teaching principles from cognitive psychology in the first-year seminar. E-Source for College Transitions, 11 (2), 14-16.


Metacognitive Reflection Assignments in Introductory Psychology

by Dennis Carpenter, University of Wisconsin-Colleges


Motivations and context: These assignments focus on study strategies, goal setting, and reflection on the effectiveness of study strategies and the extent to which goals have been achieved. They are used in Introductory Psychology courses at UW Richland, one of fourteen UW Colleges open-enrollment freshman-sophomore liberal arts campuses throughout Wisconsin. These classes typically enroll 15-35 students in each of two sections per semester. Students from a diverse range of ethnic, racial, and language backgrounds and levels of academic preparation take these classes, and many struggle with basic academic and study skills. Such skills have been emphasized in these courses over the 16 years I have taught in the UW Colleges, and the present metacognitive reflection assignments represent an evolution of this work.

Nuts and Bolts: The materials included represent a series of three assignments used in the Spring 2017 semester. These assignments vary across semesters based on the students, information I encounter in my reading, and my own reflection on their impact in previous semesters. The series is introduced at the beginning of the semester. For new students, this occurs within the context of discussing ways that college might differ from high school in its demands and the strategies required for success. For continuing students, it occurs within a discussion of student perceptions of what is required for success in college based on their own experiences. The course textbook includes personal application sections (Improving Academic Performance and Improving Everyday Memory) that are assigned as part of the first week’s readings (Weiten, 2017, pp. 23-25, 252-255). Sternberg (2016) provides an excellent overview of evidence-based effective study strategies and tips for success; it was used for the first time in the Spring 2017 semester as a supplement to Weiten (2017).

In the first assignment, students write about study strategies they intend to use in the course as well as goals for the first unit, which concludes with the first of four exams. This assignment is graded quickly, with feedback given to students within a week of submission. In-class feedback typically focuses on writing goals in clearer, more specific ways so they are attainable; online worksheets for writing SMART goals are readily available and can be helpful at any stage in this sequence of assignments. The second assignment is due a week after the first exam. At that time, students reflect on their exam performance, the effectiveness of their study strategies, and the extent to which they met their goals. Students are encouraged to refine their strategies and goals for the second unit based on the outcome of the first exam, a better idea of what the course requires, and greater insight into their own learning processes. The third assignment is due a week after the second exam. Again, students reflect on their exam performance, the effectiveness of their study strategies, and the extent to which they met their goals. They are also encouraged to narrow their focus and discuss two main strategies or goals to concentrate on for the rest of the semester. Over the years, students have appeared to increasingly struggle with focused study given the multitasking demands of their electronic devices. For this reason, the third and final assignment also includes a reading about unplugging from devices and regaining control of one’s life (Weir, 2017), along with questions about students’ experiences related to points made in that article.

Outcomes: I have witnessed students making significant changes in their approach to academic work, with improvements in course performance over the semester. Unfortunately, many students have not seemed to benefit from this intervention, at least not during the semester they take the course or in ways visible to me. Goal setting and evaluation routinely emerge as significant challenges for students. Distributed practice, self-testing, and minimizing distractions are among the more common strategies students report using successfully. Improved management of electronic devices while studying has been one of the most significant outcomes revealed in these assignments.

Lessons learned and future directions: I intentionally front-load these assignments to have maximum benefit; in the past, students perceived such assignments as redundant when done after every exam. In the future, I plan to re-introduce an end-of-semester reflection to better gauge the impact of these assignments. The student writing in these assignments provides a basis for one-on-one conversations with students about improving academic performance during office meetings, and their positive impact could be enhanced by structuring ways to have more follow-up conversations with students about their preferred study strategies and learning goals. I highly encourage integrating strategies for improving relationships with devices into any material on study strategies and metacognition. I welcome your feedback about how I might improve these attempts to improve student metacognition and look forward to learning more about your attempts to do so.

Readings Provided to Students

Sternberg, R. J. (2016). Introduction to optimizing learning in college: Tips from cognitive psychology. Perspectives on Psychological Science, 11(5), 642-660.

Weir, K. (2017, March). (Dis)connected. Monitor on Psychology, 48(3), 42-48.

Weiten, W. (2017). Psychology themes and variations (10th ed). Boston, MA: Cengage.

Grading rubrics: Rubric 1, Rubric 2, Rubric 3