Supporting Student Self-Assessment with Knowledge Surveys

by Dr. Lauren Scharff, U. S. Air Force Academy*

In my earlier post this year, “Know Cubed” – How do students know if they know what they need to know?, I introduced three challenges for accurate student self-assessment. I also introduced the idea of incorporating knowledge surveys as a tool to support student self-assessment (an aspect of metacognitive learning) and promote metacognitive instruction. This post shares my first foray into the use of knowledge surveys.

What exactly are knowledge surveys? They are collections of questions that support student self-assessment of their understanding of course material and related skills. Students complete the questions either at the beginning of the semester or prior to each unit of the course (pre), and then again immediately prior to exams (post-unit instruction). When answering, students rate their ability to answer each question (similar to a confidence rating) rather than answering it fully. The type of learning expected is highlighted by including the Bloom’s level at the end of each question. Completion of knowledge surveys develops metacognitive awareness of learning and can help guide more efficient studying.

[Image: Example knowledge survey questions]

My motivation to include knowledge surveys in my course was a result of a presentation by Dr. Karl Wirth, who was invited to be the keynote speaker at the annual SoTL Forum we hold at my institution, the United States Air Force Academy. He shared compelling data and anecdotes about his incorporation of knowledge surveys into his geosciences course. His talk inspired several of us to try out knowledge surveys in our courses this spring.

So, after a semester, what do I think about knowledge surveys? How did my students respond?

In a nutshell, I am convinced that knowledge surveys enhanced student learning and promoted student metacognition about that learning. Their use provided additional opportunities to discuss the science of learning and helped focus learning efforts. However, there were also some important lessons learned that I will use to modify how I incorporate knowledge surveys in the future.

Evidence that knowledge surveys were beneficial:

My personal observations included the following, each of which increased as the semester went on and students learned how to use the knowledge survey questions to guide their learning:

  • Students directly told me how much they liked and appreciated the knowledge survey questions. There is a lot of unfamiliar and challenging content in this upper-level course, so the knowledge survey questions served as an effective road map to help guide student learning efforts.
  • Students asked questions in class directly related to the knowledge survey questions (as well as other questions). Because I was clear about what I wanted them to learn, they were able to judge whether they had a solid understanding of those concepts and to ask questions while we were discussing the topics.
  • Students came to office hours to ask questions and, prior to the exams, were able to more clearly articulate what they did and did not understand when asking for further clarification.
  • Students realized that they needed to study differently for questions at different Bloom’s levels of learning. “Explain” questions required more than basic memorization of the related terms. I took class time to suggest and reinforce more effective learning strategies, and several students reported increasing success with those strategies and began using them in other courses (yay!).
  • Overall, students became more accurate in assessing their understanding of the material prior to each exam. More specifically, when I compared the knowledge survey self-ratings with actual exam performance, the two aligned more closely as the semester progressed. I think some of this increase in accuracy was due to the changes described in the points above.

Student feedback included the following:

  • End-of-semester feedback from students indicated that the vast majority of them thought the knowledge surveys supported their learning, with half giving the surveys the highest rating of “definitely supports learning, keep as is.”
  • Optional reflection feedback suggested that students developed learning skills related to the use of the knowledge surveys and perceived value in using them. The following quote was typical of many students:

At first, I was not sure how the knowledge surveys were going to help me. The first time I went through them I did not know many of the questions and I assumed they were things I was already supposed to know. However, after we went over their purpose in class my view of them changed. As I read through the readings, I focused on the portions that answered the knowledge survey questions. If I could not find an answer or felt like I did not accurately answer the question, I bolded that question and brought it up in class. Before the GR, I go back through a blank knowledge survey and try to answer each question by myself. I then use this to compare to the actual answers to see what I actually need to study. Before the first GR I did not do this. However, for the second GR I did and I did much better.

Other observations and lessons learned:

Although I am generally pleased with my first foray into incorporating knowledge surveys, I did learn some lessons and I will make some modifications next time.

  • The biggest lesson is that I need to take even more time to explain knowledge surveys, how students should use them to guide their learning, and how I use them as an instructor to tailor my teaching.

What did I do this past semester? I explained knowledge surveys on the syllabus and verbally at the beginning of the semester. I gave periodic general reminders and included a slide in each lesson’s PPT that listed the relevant knowledge survey questions. I gave points for completion of the knowledge surveys to increase the perception of their value. I also included instructions about how to use them at the start of each knowledge survey:

[Image: Knowledge survey instructions]

Despite all these efforts, feedback and performance indicated that many students really didn’t understand the purpose of knowledge surveys or take them seriously until after the first exam (and some even later than that). What will I do in the future? In addition to the above, I will make more explicit connections during the lesson and as students engage in learning activities and demonstrations. I will ask students to share how they would explain certain concepts using the results of their activities and the other data that were presented during the lesson. The latter will provide explicit examples of what would (or would not) be considered a complete answer for the “explain” questions in contrast to the “remember” questions.

  • The biggest student feedback suggestion for modifying the knowledge surveys pertained to the “pre” knowledge surveys given at the start of each unit. Students reported that they didn’t know most of the answers and felt that completing the pre knowledge surveys was less useful. As an instructor, however, I used those “pre” responses to get a pulse on their level of prior knowledge and to tailor my lessons. Thus, I need to better communicate how I use those “pre” results, because no one likes to take time to do what they perceive as “busy work.”
  • I also learned that students created a shared Google Doc where they would insert answers to the knowledge survey questions. I am all for students helping each other learn, and I encourage them to quiz each other so they can talk out the answers rather than simply re-reading their notes. However, it became apparent when students came in for office hours that the shared “answers” to the questions were not always correct and were sometimes incomplete. This was especially true for the higher-level questions. Because I was not a member of the shared document, I did not check their answers there. In the future, I will encourage students earlier and more explicitly to be aware of the type of learning being targeted and the type of response needed at each level, and to critically evaluate the answers being entered into such a shared document.

In sum, as an avid supporter of metacognitive learning and metacognitive instruction, I believe that knowledge surveys are a great tool for supporting both student and faculty awareness of learning, the first step in metacognition. We should then use that awareness to make necessary adjustments to our efforts – the other half of a continuous cycle that leads to increased student success.

———————————————–

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.