Promoting academic rigor with metacognition

By John Draeger (SUNY Buffalo State)

A few weeks ago, I visited Lenoir-Rhyne University to talk about promoting academic rigor, and I was reminded of the importance of metacognition. College faculty often worry that students are arriving in their courses increasingly underprepared, and they often find it difficult to maintain the appropriate level of academic rigor. Faced with this challenge, some colleagues and I developed a model for promoting academic rigor. According to this model, promoting academic rigor requires actively engaging students in meaningful content with higher-order thinking at the appropriate level of expectation for a given context (Draeger, del Prado Hill, Hunter, & Mahler, 2013). The model (see FIGURE ONE) can be useful insofar as it prompts reflection and frames conversation. In particular, faculty members can explore how to improve student engagement, how to uncover a course’s most meaningful elements, how to determine the forms of higher-order thinking most appropriate for a course, and how to modulate expectations for different student groups (e.g., majors, non-majors, general education, honors). There is nothing particularly magical about this model. It is one of many ways that college instructors might become more intentional about various aspects of course design, instruction, and assessment. However, I argue that promoting academic rigor in this way requires metacognition.

In a previous post, Lauren Scharff and I argued that metacognition can be used to select the appropriate teaching and learning strategy for a given context (Draeger & Scharff, 2016). More specifically, metacognition can help instructors “check in” with students and make meaningful “in the moment” adjustments. Similarly, engaging students in each of the components of the rigor model can take effort, especially because students often need explicit redirection. If instructors are monitoring student learning and using that awareness to make intentional adjustments, then they are more likely to encourage students to actively engage meaningful content with higher-order thinking at the appropriate level of expectation.

Consider, for example, a course in fashion merchandising. Students are often drawn to such a course because they like to shop for clothes. This may help with enrollment, but the goal of the course is to give students insight into industry thinking. In particular, students need to shift from a consumer mentality to the ability to apply consumer behavior theory in ways that sell merchandise. What would it mean to teach such a course with rigor? The model of academic rigor sketched above recognizes that each of the components can occur independently and not lead to academic rigor. For example, students can be actively engaged in content that is less than meaningful to the course (e.g., regaling others with shopping stories), and students can be learning meaningful content without being actively engaged (e.g., rote learning of consumer behavior theory). Likewise, students can be actively and meaningfully engaged with or without higher-order thinking. The goal, however, is to have multiple components of the model occur together, i.e., to actively engage students in meaningful content with higher-order thinking at the appropriate level of expectation. In the case of fashion merchandising, a professor might send students to the mall to have them use consumer behavior theory to justify why a particular rack of clothes occupies a particular place on the shop floor. If they can complete this assignment, then they are actively engaged (at the mall) in meaningful content (consumer behavior theory) with higher-order thinking (applying theory to a rack of clothes). Metacognition requires that instructors monitor student learning and use that awareness to make intentional adjustments.
If a fashion merchandising instructor finds students lapsing into stories about their latest shopping adventures, then the instructor might redirect the discussion towards higher-order thinking with meaningful content by asking the students to use consumer behavior theory to question their assumptions about their shopping behaviors.

Or consider a course in introductory astronomy (Brogt & Draeger, 2015). Students often choose such a course to satisfy their general education requirements because they think it has something to do with star gazing and it is preferable to other courses, like physics. Much to their surprise, however, students quickly learn that astronomy is physics by another name. Astronomy instructors struggle because students in introductory astronomy often lack the necessary background in math and science. The trick, therefore, is to make the course rigorous when students lack the usual tools. One solution could be to use electromagnetic radiation (a.k.a. light) as the touchstone concept for the course. After all, light is the most salient evidence we have for occurrences far away. As such, it can figure into conversations about the scientific method, including scientific skepticism about various astronomical findings. Moreover, even if students cannot do precise calculations, it might be enough that they can estimate order-of-magnitude quantities, such as the distance to a star. Astronomy instructors have lots of great tools for actively engaging students in order-of-magnitude guesstimates. These can be used to scaffold students into understanding how answers to order-of-magnitude estimates involving light can provide evidence about distant objects. If so, then students are actively engaging meaningful content with higher-order thinking at a level appropriate to an introductory course satisfying a general education requirement. Again, metacognition can help instructors make intentional adjustments based on “in the moment” observations about student performance. If, for example, an instructor finds that students “check out” once mathematical symbols go up on the board, the instructor can redouble efforts to highlight the importance of understanding order of magnitude and can make explicit the connection between previous guesstimate exercises and the symbols on the board.

If tools for reflection (e.g., a model of academic rigor) help instructors map out the most salient aspects of a course, then metacognition is the mechanism by which instructors navigate that map. If so, then I suggest that promoting academic rigor requires metacognition. It is important to understand how we can help students actively engage in meaningful course content with higher-order thinking at the appropriate level of expectation for a given course. However, consistently shepherding students to the intersection of those elements requires metacognitive awareness and self-regulation on the part of the instructor.

References

Brogt, E. & Draeger, J. (2015). “Academic Rigor in General Education, Introductory Astronomy Courses for Nonscience Majors.” The Journal of General Education, 64 (1), 14-29.

Draeger, J. (2015). “Exploring the relationship between awareness, self-regulation, and metacognition.”  Retrieved from https://www.improvewithmetacognition.com/exploring-the-relationship-between-awareness-self-regulation-and-metacognition/

Draeger, J., del Prado Hill, P., Hunter, L. R., & Mahler, R. (2013). “The Anatomy of Academic Rigor: One Institutional Journey.” Innovative Higher Education, 38 (4), 267-279.

Draeger, J. & Scharff, L. (2016). “Using Metacognition to select and apply appropriate teaching strategies.” Retrieved from https://www.improvewithmetacognition.com/using-metacognition-select-apply-appropriate-teaching-strategies/


Teacher, Know Thyself (Translation: Use Student Evaluations of Teaching!)

by Guy Boysen, Ph.D., McKendree University

I’ll call him Donald. I am willing to bet that you know a Donald too. Students fled from Donald’s classes in droves. His was the pedagogy of narcissism – “I am the Lord thy teacher and thou shall have no classroom activities unfocused on me!” Donald’s grading system was so subjective, vindictive, and Byzantine as to be barely defensible. Enrollment in his classes always followed the same pattern: intro course full of students who did not know any better at the start of the semester and then decimation by the end of the semester; advanced seminars empty except for a few adoring students with Stockholm syndrome. Asked about his student evaluations, Donald would say “My seminar evals are good, but I don’t even look at my intro evals anymore – they don’t know about teaching.”

Donald calls to mind the classic metacognitive phenomenon of being unskilled and unaware of it (Kruger & Dunning, 1999; Lodge, 2016; Schumacher, Akers, & Taraban, 2016). This is something teachers see in students all of the time; weak students overestimate their abilities and therefore don’t work to improve. As illustrated by Donald, this phenomenon applies to teachers as well.

There are a number of wide-ranging characteristics that make someone a model teacher (Richmond, Boysen, & Gurung, 2016), but the use of student evaluations to improve teaching is one that has a strong metacognitive component. Student evaluations provide teachers with feedback so that they can engage in metacognitive analysis of their pedagogical skills and practices. Based on that analysis, goals for improvement can be set and pursued.

Recommendations for Using Student Evals

How should teachers use student evaluations to develop metacognitive awareness of their pedagogical strengths and weaknesses? Several suggestions can be found in An Evidence-Based Guide for College and University Teaching: Developing the Model Teacher (Richmond, Boysen, & Gurung, 2016).

Set goals for improvement and follow through with them.

Have you ever gotten on a scale and not liked the number staring back at you? Did you just get on and off the scale repeatedly expecting the number to change? No? Well, that trick doesn’t work in teaching either. Collecting student evaluations without using them to set and implement goals for improvement is like a diet that only consists of repeated weigh-ins – the numbers will not change without the application of direct effort. Use your student evaluations, preferably in collaboration with a mentor or teaching expert, to set manageable goals for change. 

Select the correct assessment tool.    

Wouldn’t it be great if we could select our own teaching evaluations? Mine might look something like this.

But wait! You can select your own teaching evaluations. Official, summative evaluations may be set at the institutional level, but teachers can implement any survey they want for professional development purposes. Choose wisely, however.

If you are a numbers person, select a well-researched measure that provides feedback across several dimensions of teaching that are relevant to you. Perhaps the best known of these is the Student Evaluation of Educational Quality (SEEQ), which measures teaching quality across nine different factors (Marsh, 1983). The advantages to this type of measure are that the results can be scientifically trusted and are detailed enough to inform goals for improvement.

Not a numbers person? You might ask for written comments from students. Whatever you want to know about your teaching, you can simply ask – believe me, students have opinions! Although analyzing student comments can be laborious (Lewis, 2001), they can offer unequalled richness and specificity. Beware of asking for general feedback, however. General questions tend to elicit general feedback (Hoon, Oliver, Szpakowska, & Newton, 2015). Rather, provide specific prompts such as the following.

  • What should I/we STOP doing in this class?
  • What should I/we START doing in this class?
  • What should I/we CONTINUE doing in this class? 

Don’t wait until the end of the semester.

Imagine if Donald could get feedback from the students who drop his classes. Perhaps he could make pedagogical changes to reach those students before they flee. Guess what, he can! Formative assessment is the key.

Teachers often allow official, end-of-semester student evaluations to serve as their only feedback from students. The problem with this approach is that the feedback comes too late to make midsemester course corrections. This is analogous to the metacognitive importance of providing students with early feedback on their performance. You wouldn’t expect students to succeed in your course if a final exam was the only grade, would you? Well, don’t put yourself in the same position. Model teachers ask for student feedback both at the end of the semester (i.e., summative) and early enough in the semester to make immediate improvements (i.e., formative).

Make changes large and small.

Student evaluations can be used to inform revisions to all levels of pedagogy. Imagine that students report being absolutely bewildered by a concept in your class. Potential responses to this feedback could be to change (a) the time spent on the concept in class, (b) scaffolding of knowledge needed to understand the concept, (c) the availability of study aids related to the concept, (d) the basic instructional technique used to teach the concept, or (e) the decision to even include the concept in the course. For model teachers, student feedback can inform changes large and small.

Conclusion

Every single semester students comment on my evaluations that they want the tests to be multiple choice rather than short answer/essay, and every semester I tell students that I will not be changing the test format because students do not study as hard for multiple-choice tests. Thus, my point is not that model teachers incorporate all student feedback into their courses. However, failure to respond should be a sound and intentional pedagogical choice rather than a Donald-like failure of metacognition – don’t be caught unskilled and unaware.

References

Hoon, A., Oliver, E., Szpakowska, K., & Newton, P. (2015). Use of the Stop, Start, Continue method is associated with the production of constructive qualitative feedback by students in higher education. Assessment & Evaluation in Higher Education, 40, 755-767. doi:10.1080/02602938.2014.956282

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 1121-1134.

Lewis, K. G. (2001). Making sense of student written comments. New Directions for Teaching and Learning, 87, 25-32.

Lodge, J. (2016). Hypercorrection: Overcoming overconfidence with metacognition. Retrieved from https://www.improvewithmetacognition.com/hypercorrection-overcoming-overconfidence-metacognition/

Marsh, H. W. (1983). Multidimensional ratings of teaching effectiveness by students from different academic settings and their relation to student/course/instructor characteristics. Journal of Educational Psychology, 75, 150-166. doi:10.1037/0022-0663.75.1.150

Richmond, A. S., Boysen, G. A., & Gurung, R. A. R. (2016). An evidence-based guide for college and university teaching: Developing the model teacher. Routledge.

Schumacher, J. R., Akers, E., & Taraban, R. (2016). Unskilled and unaware: A metacognitive bias. Retrieved from https://www.improvewithmetacognition.com/unskilled-unaware-metacognitive-bias/


New Year Metacognition

by Lauren Scharff, Ph.D., United States Air Force Academy *

Happy New Year to you! This seasonal greeting has many positive connotations, including new beginnings, hope, fresh starts, etc. But, it’s also strongly associated with the making of new-year resolutions, and that’s where the link to metacognition becomes relevant.

As we state on the Improve with Metacognition home page, “Metacognition refers to an intentional focusing of attention on the development of a process, so that one becomes aware of one’s current state of accomplishment, along with the situational influences and strategy choices that are currently, or have previously, influenced accomplishment of that process. Through metacognition, one should become better able to accurately judge one’s progress and select strategies that will lead to success.”

Although this site typically focuses on teaching and learning processes, we can be metacognitive about any process or behavior in which we might engage. A new year’s resolution typically involves starting a new behavior that we deem to be healthier for us, or stopping an already established behavior that we deem to be unhealthy for us. Either way, some effort is likely to be involved, because if the change were going to be easy, we wouldn’t need a resolution to make it.

Effort alone, however, is unlikely to lead to success. Just like students who “study harder” without being metacognitive about it, people who simply “try hard” to make a change will often be unsuccessful. This is because most behaviors, including learning, are complex. A multitude of situational factors and personal predispositions interact to influence our success in reaching our behavioral goals. Thus, it’s unlikely that a single strategy will work at all times. In fact, persisting with an ineffective strategy will lead to frustration, cynicism, and, ultimately, giving up on one’s resolution.

Now, typically, I am not the sort of person who actually makes new-year resolutions. But this new year presents a new situation for me. I will be on sabbatical and working from home. I have prepared a fairly ambitious list of professional development activities that I hope to accomplish. I know I am capable of each of them. But, I also know that I will be working in an environment with a different group of distractions and without many external deadlines. Instead of committee work, grading, short turn-around taskers, and meetings with students and colleagues preventing me from working on my publications and other professional development activities, I will have a dog with big brown eyes who would love to go for a walk, children who need attention when they’re home from school, and projects at home that I usually can put out of mind when I’m at the office.

My resolution to myself for the coming 6 months of my sabbatical is that I will create a positive work environment for myself and accomplish my list of professional development activities while maintaining a balance with my family and personal goals. I know that I will need a variety of strategies, and that I will need to take time to reflect on the state of my progress and show self-regulation in my choice of strategies at different times. I plan to use a journal to help me with my awareness of the alignment between my daily goals and the activities in which I choose to engage in order to accomplish those goals.[1] This awareness will guide my self-regulation when, inevitably, I get off track. I also plan to make some public commitments and provide updates to my friends and colleagues regarding specific goals I plan to accomplish at specific times, as public commitment provides motivation, often results in social support, and is another way to encourage self-awareness and self-regulation, i.e. metacognition.

I’ll let you know how it goes in 6 months. 🙂  Meanwhile, Happy New Year and all the best to you with your new-year resolutions. Try using the tools of metacognition to help you succeed!

[1] See our preliminary research summary about the effectiveness of instructors using journals to enhance their metacognitive instruction.

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.