The Onion has created a dark humor Groundhog Day response to school and mass shootings in the U.S.: ‘No Way To Prevent This,’ Says Only Nation Where This Regularly Happens.
Public discourse and social media discussions suffer from something just as predictable after school shootings—a fruitless clash of ideological claims, often bereft of evidence or historical context.
As an educator and a scholar, I feel compelled to advocate for safety in our society and our schools; therefore, I routinely address the research base on gun violence and school safety through my Twitter feed, on Facebook, and in my blogging.
Here’s a pattern I witness each time.
I post something about gun violence and school shootings, and someone comments with a claim that school shootings are the result of a decline in morals, occasionally tossing in a reference to taking God and prayer out of schools (this last part is, by the way, entirely false: forced prayer has been deemed unconstitutional in public schools, but everyone in those schools is free to pray without interference).
This popped up after the shooting in Santa Fe, Texas, so I simply responded by asking what evidence exists that the U.S. was ever a moral/ethical country, and thus how we could prove a decline.
The person openly stated that they had no proof, and just believed it to be true—conceding that I had the right to believe whatever I wanted.
Herein is the problem: Most people believe and argue as ideologues, and thus assume everyone else is arguing as an ideologue also—reducing public and social media debate to little more than a shouting match absent evidence.
The worst extremes of being ideological, for example, are racism and sexism. Racism is the idea that some races are superior to others, and racists, then, impose that idea onto the world instead of drawing conclusions about race from evidence. Sexism functions the same regarding sex/gender.
The ideologue, then, can often be discredited by evidence—except that those functioning by ideology alone refuse to move from being ideological to being informed by that evidence.
Science, often misunderstood, is a discipline designed to build better understanding through a variety of ways of thinking and reasoning:
“In inductive inference, we go from the specific to the general. We make many observations, discern a pattern, make a generalization, and infer an explanation or a theory,” Wassertheil-Smoller told Live Science. “In science, there is a constant interplay between inductive inference (based on observations) and deductive inference (based on theory), until we get closer and closer to the ‘truth,’ which we can only approach but not ascertain with complete certainty.”
We start with some idea—I think this is true about the world, or human behavior—and then we put that idea to a test. The outcome of that testing creates some foundation for anticipating how the world will work, how humans will behave.
However, those ideas grounded in evidence are then always subject to the consequences of further evidence—if the evidence reinforces the idea, it survives; if the evidence contradicts the idea, it must change.
Ideologues, resistant to evidence, fall victim to logical fallacies—flawed thinking, for example:
A leading candidate would be “attribution error.” Attribution error leads us to resist attempts to explain the bad behavior of people in the enemy tribe by reference to “situational” factors—poverty, enemy occupation, humiliation, peer group pressure, whatever. We’d rather think our enemies and rivals do bad things because that’s the kind of people they are: bad….
This is attribution error working as designed. It sustains your conviction that, though your team may do bad things, it’s only the other team that’s actually bad. Your badness is “situational,” theirs is “dispositional.”…
Another cognitive bias—probably the most famous—is confirmation bias, the tendency to embrace, perhaps uncritically, evidence that supports your side of an argument and to either not notice, reject, or forget evidence that undermines it.
To refuse to continually interrogate our ideas about the world against the evidence is to commit to faulty thinking; attribution error and confirmation bias are just a couple of the most powerful ways people become mired in false ideology and resistant to credible ideas.
Being ideological instead of informed has dire consequences. Ideological thinking helped create a healthcare crisis: patients believed antibiotics cure every sort of illness, and the medical field compounded the error by allowing patient demand to drive bad medical practice.

Antibiotic-resistant disease is the child of ideological rather than informed behavior.
The gun debate and the pursuit of safety also suffer from ideological flaws.
For example, many people argue for gun ownership, and against gun regulation, because they believe guns in the home protect their family and property.
Two aspects of this argument are important.
First, this argument conflates safety with gun ownership without investigating whether or not this is a fair association.
The personal and family safety—self-defense—argument is both rational and irrational. To desire safety is entirely rational; to cling to guns in that pursuit, once you are informed and not ideological, becomes irrational.
Thus, second, gun ownership for safety has many outcomes more common than self-defense—domestic violence, suicide, and accidental shootings (see research listed here).
At the root of many people being ideological and not informed is our basic human nature; we are causal machines in pursuit of survival.
Humans are constantly jumping from correlation to causation because we are predisposed to making those inferences at the unconscious level, split-second decisions once necessary to survive.
Consider, again, our rush to make medical claims not based in evidence: People think being cold causes colds; however, colds are the result of the presence of viruses. (It seems worth noting that we can experience cold with our senses, while viruses are not detectable by the unaided senses.)
Extreme cold can lead to hypothermia, and can reduce our resistance to bacteria and viruses. But cold weather doesn’t cause colds.
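The cold-weather example above can be made concrete with a toy simulation. The sketch below is purely illustrative, with made-up probabilities: cold weather drives people indoors together, and indoor exposure drives infection, so weather and colds correlate even though temperature never enters the infection step. The variable names and numbers are assumptions for illustration, not real epidemiological data.

```python
import random

random.seed(42)

def simulate(n=10_000):
    """Generate (cold_weather, caught_cold) pairs from a toy causal model."""
    records = []
    for _ in range(n):
        cold_weather = random.random() < 0.5
        # Confounder: people cluster indoors more often when it is cold.
        indoor_exposure = random.random() < (0.8 if cold_weather else 0.3)
        # Infection depends ONLY on exposure to the virus, not on temperature.
        caught_cold = random.random() < (0.4 if indoor_exposure else 0.1)
        records.append((cold_weather, caught_cold))
    return records

def cold_rate(records, weather):
    """Fraction of people who caught a cold, given the weather condition."""
    outcomes = [caught for w, caught in records if w == weather]
    return sum(outcomes) / len(outcomes)

data = simulate()
print(f"cold rate in cold weather: {cold_rate(data, True):.2f}")
print(f"cold rate in warm weather: {cold_rate(data, False):.2f}")
# Cold weather "predicts" colds, yet the correlation is carried
# entirely by indoor exposure -- correlation is not causation.
```

The split-second inference "cold weather, therefore colds" is exactly the kind of jump the simulation exposes: the pattern is real, but the causal story behind it is not the one our senses suggest.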
To be ideological (and wrong) is easier because conflating correlation with causation offers something seemingly concrete; to be informed requires a willingness to step back from what we believe.
As great failures of ideology, then, we demand antibiotics and cling to guns because we have made flawed associations with both in pursuit of perfectly good outcomes—health and safety.
To be informed, and not ideological, means that we must be willing to identify what it is we are trying to understand. And then we must be willing not only to seek out evidence but also to recognize that evidence even as it goes against our initial idea—that which we have always believed.