The Challenge of Deep Learning in the Age of LearnSmart Course Systems

by Lauren Scharff, Ph.D. (U.S. Air Force Academy)

A close friend and colleague of mine can reliably be counted on to point out that students are rational decision makers. There is only so much time in their days, and they have full schedules. If there are ways for students to spend less time per course and still “be successful,” they will find them. Unfortunately, those efficient choices may short-change their long-term, deep learning.

This tension between efficiency and deep learning was again brought to my attention when I learned about the “LearnSmart” (LS) text application that automatically comes with the e-text my department chose for the core course I’m teaching this semester. On the plus side, the publisher has incorporated learning science (metacognitive prompts and spaced review of material) into the design of LearnSmart. Less positively, some aspects of the LearnSmart design seem to lead many students to choose efficiency over deep learning.

In a nutshell, the current LS design prompts learning shortcuts in several ways. Pre-highlighted text discourages reading of the non-highlighted material, and the fact that the LS quiz questions come primarily from highlighted material reinforces those selective reading tendencies. A less conspicuous learning trap results from the design of the LS quiz credit algorithm that incorporates the metacognitive prompts. The metacognitive prompts not only take a bit of extra time to answer, but students also only get credit for completing questions for which they indicate good understanding of the question material. If they indicate questionable understanding, that question does not count toward the required number of pre-class reading check questions, even if they ultimately answer correctly. [If you’d like more details about the LS quiz process design, please see the text at the bottom of this post.]

Last semester, the fact that many of our students were choosing efficiency over deep learning became apparent when the first exam was graded. Despite very high completion rates on the LS pre-class reading quizzes and lively class discussions, exam grades were, on average, more than a letter grade lower than in previous semesters.

The bottom line is that learning tools, just like teaching tools, are only effective if they are used in ways that align with objectives. As instructors, our objective typically is student learning (hopefully deep learning in most cases). Students’ objectives might be correlated with learning (e.g., grades) or not (e.g., finding the fastest way to complete an assignment). If we design our courses or choose activities that allow students to complete them efficiently (quickly) while still earning good grades, then we are inadvertently supporting short-cuts around real learning.

So, how do we tackle our efficiency-shortcut challenge as we go into this new semester? The publisher offers a tool that lets us track student responses by level of self-reported understanding and correctness. We can see whether any students are placing the majority of their responses in the “I know it” category. If many of those responses are also incorrect, those students are likely prioritizing short-term efficiency over long-term learning, and we can talk with them one-on-one about their choices. That’s helpful, but it’s reactive.
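To make that check concrete, here is a minimal sketch of the kind of screen I have in mind, assuming the publisher’s report can be exported as rows of student, confidence level, and correctness. The column names, file name, and cutoffs below are my own illustrative choices, not the publisher’s actual format:

```python
# Flag students whose responses are mostly "I know it" yet often incorrect.
# Assumes a CSV export with columns: student, confidence, correct (true/false).
# The column names and the 50%/40% cutoffs are illustrative assumptions.
import csv
from collections import defaultdict

def flag_students(report_path, know_it_cutoff=0.5, wrong_cutoff=0.4):
    stats = defaultdict(lambda: {"total": 0, "know_it": 0, "know_it_wrong": 0})
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            s = stats[row["student"]]
            s["total"] += 1
            if row["confidence"] == "I know it":
                s["know_it"] += 1
                if row["correct"].strip().lower() != "true":
                    s["know_it_wrong"] += 1
    flagged = []
    for student, s in stats.items():
        know_it_rate = s["know_it"] / s["total"]
        wrong_rate = s["know_it_wrong"] / s["know_it"] if s["know_it"] else 0.0
        if know_it_rate >= know_it_cutoff and wrong_rate >= wrong_cutoff:
            flagged.append((student, know_it_rate, wrong_rate))
    return flagged

for student, know_it_rate, wrong_rate in flag_students("ls_report.csv"):
    print(f"{student}: {know_it_rate:.0%} 'I know it', "
          f"{wrong_rate:.0%} of those answered incorrectly")
```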

The real question is: how do we get students to consciously prioritize their long-term learning over short-term efficiency? For that, I suggest additional explicit discussion and another layer of metacognition. I plan to regularly check in with the students, hold class discussions aimed at bringing their choices about their learning behaviors into conscious awareness, and positively reinforce their self-regulation of deep-learning behaviors.

I’ll let you know how it goes.

——————————————–

Here is some additional background on the e-text and the complimentary LearnSmart (LS) application that comes with it.

There are two ways to access the text. The first is an electronic version of the printed text, with nice annotation capabilities for students who want to underline, highlight, or take notes; it is essentially a printed text on screen. The second is through the LS chapters. As mentioned above, when students open these chapters, they will find that some of the text has already been highlighted for them!

As they read through the LS chapters, students are periodically prompted with LS quiz questions (drawn primarily from the highlighted material). These questions are where some of the learning science comes in. Students are given a question about the material, but rather than seeing the multiple-choice response options right away, they are first given a metacognitive prompt: how confident are they that they know the answer without seeing the options? They can choose “I know it,” “Think so,” “Unsure,” or “No idea.” Once they report this “awareness” of their understanding, they are shown the response options and try to answer the question correctly.

This next point is key: it turns out that in order to get credit for question completion in LS, students must do BOTH of the following: 1) choose “I know it” when indicating understanding, and 2) answer the question correctly. If students indicate any other level of understanding, or if they answer incorrectly, LS will give them more questions on that topic, and the effort spent on that question won’t count toward completion of the required number of questions for the pre-class activity.
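In code terms, that rule reduces to a single conjunction. Here is a minimal sketch of the credit logic as I understand it from the behavior described above; the names and data shapes are mine, not the publisher’s:

```python
# A sketch of the LS credit rule as described above: an attempt counts
# toward the required question total only if the student claims the top
# confidence level AND answers correctly; anything else triggers more
# questions on the same topic. All names here are illustrative.
CONFIDENCE_LEVELS = ("I know it", "Think so", "Unsure", "No idea")

def counts_for_credit(confidence: str, answered_correctly: bool) -> bool:
    """Return True if this attempt counts toward activity completion."""
    return confidence == "I know it" and answered_correctly

# A correct answer with an honest "Unsure" earns no credit...
assert counts_for_credit("Unsure", True) is False
# ...while "I know it" plus a lucky guess earns full credit.
assert counts_for_credit("I know it", True) is True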

And there’s the rub. Efficient students quickly learn that they can complete the pre-class reading quiz activity much more quickly if they choose “I know it” for every metacognitive understanding probe. If they guess at the subsequent answer and get it correct, it counts toward completion of the activity and they move on. If they answer incorrectly, LS gives them another question on that topic, but they are no worse off with respect to time and effort than if they had indicated that they weren’t sure of the answer.
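To see why the shortcut is rational purely on time grounds, here is a toy simulation of the credit rule under two strategies: a student who reports confidence honestly and an otherwise identical student who always claims “I know it.” The probabilities are invented for illustration, not measurements of real student or LS behavior:

```python
# Toy model: expected attempts to earn credit for N questions under the
# rule "credit = 'I know it' AND correct". Probabilities are made up.
import random

def attempts_to_finish(required, p_correct, p_claim_know_it):
    """Simulate attempts until `required` questions earn credit, where
    credit requires claiming "I know it" AND answering correctly."""
    credits = attempts = 0
    while credits < required:
        attempts += 1
        claims_know_it = random.random() < p_claim_know_it
        answers_correctly = random.random() < p_correct
        if claims_know_it and answers_correctly:
            credits += 1
    return attempts

def average_attempts(trials=2000, **kwargs):
    return sum(attempts_to_finish(**kwargs) for _ in range(trials)) / trials

random.seed(0)
# Honest student: answers 70% correctly but claims "I know it" on only
# 60% of questions, so correct-but-"Unsure" attempts earn nothing.
print("honest  :", average_attempts(required=10, p_correct=0.7, p_claim_know_it=0.6))
# Shortcut student: identical knowledge, claims "I know it" every time.
print("shortcut:", average_attempts(required=10, p_correct=0.7, p_claim_know_it=1.0))
```

In this toy model the expected attempt count is required ÷ (claim rate × correct rate), so the honest student needs roughly 10 ÷ (0.6 × 0.7) ≈ 24 attempts while the shortcut student needs only 10 ÷ 0.7 ≈ 14. Honest reporting is strictly penalized in time.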

If students actually take the time to use the LS quiz features as intended rather than shortcut them (there are additional features I haven’t mentioned here), their deep learning should be enhanced. However, unless they come to value deep learning over efficiency and short-term grades (e.g., quiz completion), there is no benefit to the technology. In fact, it might further undermine their learning by fostering a false sense of understanding.