Critically Thinking about our Not-So-Critical Thinking in the Social World

By Randi Shedlosky-Shoemaker and Carla G. Strassle, York College of Pennsylvania

When people fail to engage in critical thinking while navigating their social world, they inevitably create hurdles that disrupt their cultural awareness and competence. Unfortunately, people generally struggle to see the hurdles that they construct (i.e., bias blind spot; Pronin, Lin, & Ross, 2002). We propose that metacognition can help people understand the process by which they think about and interact with others.

[Image: a photo montage of faces from a wide variety of people]

The first step is to reflect on existing beliefs about social groups, which requires people to examine the common critical thinking errors they may be making. By analyzing those errors, people can begin to take down the invisible hurdles on the path to cultural awareness and competence. Drawing on the metacognitive principles collected by Levy (2010), in this post we discuss how common critical thinking failures affect how people define and evaluate social groups, as well as how those failures preserve the resulting assumptions. More importantly, we offer suggestions for avoiding those failures.

Defining Social Groups

Social categories, by their very nature, are social constructs. That means people should think of social categories not in terms of accuracy, but in terms of utility (Levy, 2010, pp. 11-12). For example, knowing a friend’s sexual orientation can help one consider which romantic partners that friend may be interested in. When people forget that dividing the world into social groups is not about accurately representing others, but rather a mechanism to facilitate social processes, they engage in an error known as reification. With social groups, this error often involves treating tangible, biological factors (e.g., genetics) as the root cause of social constructs (e.g., race, gender). To avoid this reification error, people should view biological and psychological variables as two separate but complementary levels of description (Levy, 2010, pp. 15-19) and remember that social categories are only important insofar as they are useful.

Beyond an inappropriate reliance on biological differences to justify the borders between social groups, people often oversimplify those groups. Social categories are person-related variables, which are best represented on a continuum; reducing those variables to discrete, mutually exclusive groups creates false dichotomies (Levy, 2010, pp. 26-28). False dichotomies, such as male or female, make it easier to overlook both the commonalities shared by individuals across different groups and the differences that exist between members of the same group.

Overly simplistic dichotomies also support the assumption that two groups are polar opposites (e.g., male is the polar opposite of female, Black is the polar opposite of White). Such an assumption ignores that individuals can belong to both supposedly opposite groups (e.g., identify as multiple races/ethnicities) or to neither (e.g., identify as agender).

Here, metacognition promotes reflection on the criteria used to define group memberships. In that reflection, people should consider whether the borders they apply to groups are too constraining, leading them to misrepresent the individuals with whom they interact. They should also consider ways in which seemingly different groups can share features while still maintaining some degree of uniqueness (i.e., the similarity-uniqueness paradox; Levy, 2010, pp. 39-41). By appreciating the nature and limitations of the categorization process, people can reflect on whether their applications of group membership are meaningful.

Evaluating Social Groups

Critical thinking failures that occur when defining social categories are compounded when people move from describing social groups to evaluating them (i.e., the evaluative bias of language; Levy, 2010, pp. 4-7). In labeling social others, people often speak to what they have learned to see as different. Because dominant groups retain the power to set the standards, people may learn to treat dominant groups as the default (i.e., cultural imperialism; Young, 1990). For example, when people describe others as “that older woman,” “that kid,” “that blind person,” and so on, their chosen label conveys what they see as divergent from the status quo. By becoming more aware of the language they use, people simultaneously become more aware of how they think about social others based on social grouping. In monitoring and reflecting on language, then, metacognition affords a valuable opportunity to adapt the thinking that language expresses.

Changing language can be challenging, however, particularly when people find themselves in environments that lack diversity. Frequently, people are surrounded by others who look, think, and act like them. When surrounded by others who largely mirror themselves, their unreflective attempts to make sense of the world naturally echo their own point of view. This is problematic for two reasons. First, people tend to rely on readily available information when making decisions and judgments (i.e., the availability heuristic; Tversky & Kahneman, 1974).

Second, with their own views reflected back at them, people easily overestimate how common their beliefs and behaviors are (i.e., the false consensus effect; Ross, Greene, & House, 1977). That inaccurate assessment of “common” can lead people to conclude that such beliefs and behaviors are also “good.” Conversely, what is seen as different or uncommon, relative to the self, becomes “bad” (i.e., the naturalistic fallacy; Levy, 2010, pp. 50-51).

By pausing to assess the variability of the perspectives they have access to, people can use metacognition to consider which perspectives they are missing. In that way, they can more intentionally seek out ideas and experiences that differ from their own.

Preserving Assumptions

Though not easy, breaking away from one’s own point of view and seeking out diverse perspectives can also address another hurdle that people create for themselves: the tendency to preserve existing assumptions (i.e., belief perseverance; e.g., Ross & Anderson, 1982). Change takes work, and not surprisingly, people often choose the path of least resistance, making new information fit into the system they already have (i.e., assimilation bias; Levy, 2010, pp. 154-156).

Further, people tend to seek out information that supports existing beliefs while disregarding or discounting disconfirming information (i.e., confirmation bias; Levy, 2010, pp. 164-165). Given this habit of sticking to what fits with existing beliefs, people develop an illusion of consensus. Existing beliefs are further reinforced when people fail to realize that those beliefs inadvertently influence behaviors, which in turn shape interactions, thereby creating situations that support, rather than challenge, the existing belief system (i.e., self-fulfilling prophecy; e.g., Wilkins, 1976).

This tendency, then, to protect what one already “knows” speaks to the necessity of metacognition for challenging one’s existing belief system. When people analyze and question their existing beliefs, they can begin to recognize where revision is needed and choose to seek out new perspectives to guide it.

Summary

Many of the critical thinking failures above occur without much effort or conscious awareness on our part. Engaging in metacognition, and non-defensively addressing the unintentional errors one makes, allows people to break down the common hurdles that disrupt cultural awareness and competence. It is when people critically reflect on their thought processes, identifying the potential errors that may have shaped their existing perspectives, that they can begin to change how they think and feel about social others. In developing a heightened sense of cultural awareness and competence, metacognition helps us all realize that the world is a far more complex, and more interesting, place.

References

Levy, D. A. (2010). Tools of critical thinking: Metathoughts for psychology. Waveland Press.

Pronin, E., Lin, D. Y., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin, 28, 369-381. https://doi.org/10.1177/0146167202286008

Ross, L., & Anderson, C. (1982). Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases. Cambridge University Press.

Ross, L., Greene, D., & House, P. (1977). The “false consensus effect”: An egocentric bias in social perception and attribution processes. Journal of Experimental Social Psychology, 13, 279-301. https://doi.org/10.1016/0022-1031(77)90049-X

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131. https://doi.org/10.1126/science.185.4157.1124

Wilkins, W. E. (1976). The concept of a self-fulfilling prophecy. Sociology of Education, 49, 175–183. https://doi.org/10.2307/2112523

Young, I. (1990). Justice and the politics of difference. Princeton University Press.