The True Dividing Line: Part Two

“You are a slow learner, Winston.”

“How can I help it? How can I help but see what is in front of my eyes? Two and two are four.”

“Sometimes, Winston. Sometimes they are five. Sometimes they are three. Sometimes they are all of them at once. You must try harder. It is not easy to become sane.”

George Orwell, 1984

Last month my wife Leanne and I stopped at a Cracker Barrel on the way home from a trip. We noticed some “suits” sizing up one of the walls adorned with Americana, and I wondered whether it was time for a makeover. We were one of the youngest couples there.

When my former boss, Senator Rick Santorum, ran for President in 2012, I called our target audience “Cracker Barrel Conservatives.” I think it was David Brooks (author and political and cultural commentator) who told me that the primary battleground counties would be those with both a Starbucks and a Cracker Barrel in the same zip code.

It was no surprise, therefore, to see the move by Cracker Barrel corporate to freshen up their logo, and possibly their interior, to attract younger customers. It was also no surprise to me that Cracker Barrel voters would not take the change lightly. What was a surprise, however, was reading that the controversy was stoked by chatbots, according to a report by the Wall Street Journal.  

The online logo backlash began with human-run accounts that have hundreds of thousands of followers. They started encouraging a boycott Aug. 20, one day after Cracker Barrel first mentioned its new logo… The height of activity came just before midnight on Aug. 20, when X saw around 400 Cracker Barrel posts a minute. Seventy percent of the accounts promoting boycotts at that point used duplicate messages, a key marker of coordinated bots … Bots or likely bots authored 44.5% of X posts mentioning Cracker Barrel in the 24 hours after the new logo gained attention on Aug. 20 … that number rose to 49% among posts calling for a boycott.

Studies have shown that bot accounts can be AI-driven or coordinated by both domestic and international actors intent on escalating conflict and polarization. Did President Trump’s escalator stop at the UN? It certainly did. Was it an intentional act of protest, as many news segments and social media posts claimed? Most likely not.

President Trump stopped on the United Nations escalator

As we wrote last month, for many Americans, their primary lens has become political ideology, and they are unknowingly susceptible to manipulation. The amount of disinformation and outrage-driven social content coming from the left and the right, from internal and external sources, is already distorting our perspectives. It will only get worse as bad actors use AI to create ever more compelling misinformation and inflammatory messages. And it isn’t helpful when political leaders validate AI-generated images created to inflame or mislead by circulating them.

In fact, no matter which AI option you prefer (ChatGPT, Grok, etc.), it most likely has a built-in bias. We are entering a time when AI-generated images and content may be fashioned to feel even more compelling than reality, further deepening our biases while confirming them.

I haven’t read Orwell’s 1984 since high school, but I am working through the graphic novel version released last year. I was struck by how we can be convinced something is true, even against our own reason, when it is reinforced by our tribal leaders or other authority figures.

1984

We saw this in real time after the Charlie Kirk assassination, when our ideologically formed presumptions were reinforced by viral media that furthered and fueled conspiracy theories. We are all susceptible to conspiracy theories, in part because there actually are conspiracies in the world, but the speed and validation they receive push us quickly into our polarized corners without the time we need to do our due diligence.

AI is a tool that can and should be used to advance our common good, but without safeguards, it will contribute to the decline of the very critical thinking skills we need to ensure that we are not misled, manipulated, and even made insane by the content it generates.

We are entering a brave new world, and we need to embrace a few practices to ensure that we don’t become completely divided by it:

Diversify your sources. Chatbots have biases, and so do people. More often than not, the truth is found across multiple sources. Pulling from a diverse spectrum of sources can counterbalance an echo chamber of inflammatory messages and, at the same time, provide insight into other viewpoints, the first step to finding common ground.

Fact check. The barrier to entry for a post on the internet is low. Humans and chatbots alike are guilty of spreading misinformation. Let us be a society that values truth, by taking the time to verify it ourselves.

Be wary. Misinformation is not only pervasive but compelling. The limitations of our knowledge leave us susceptible to conspiracy and manipulation. Precisely because AI may be eroding our critical thinking skills, we must be slow to accept what we read as truth.

Keep everything in perspective. Don’t let a small issue, possibly inflamed by chatbots and actors profiteering from outrage, become too big in your mind. There are more important things in life than whether SMU changes its mascot to a Mustang or whether Superman is too kind to animals.

Disconnect from rage media. Rage baiting is designed to drive engagement and traffic while generating potential revenue for the poster. We are more likely to engage online when we have a negative reaction to content. Media outlets and algorithms know this; disconnecting from these inflammatory messages is not only good for us, it also rightly withholds our support from rage media.

Lean into real relationships. As we spend time behind screens, let us not forget to connect with one another face to face, remembering our shared humanity. Oftentimes, it is during these moments that common ground is built.

If we don’t make the decision, personally and collectively, to tone down the volume, we are likely to see more, not less, political violence. As Ezra Klein wrote about the Charlie Kirk assassination:

American politics has sides. There is no use pretending it doesn’t. But both sides are meant to be on the same side of a larger project — we are all, or most of us, anyway, trying to maintain the viability of the American experiment. We can live with losing an election because we believe in the promise of the next election; we can live with losing an argument because we believe that there will be another argument. Political violence imperils that.

Kirk and I were on different sides of most political arguments. We were on the same side on the continued possibility of American politics. It is supposed to be an argument, not a war; it is supposed to be won with words, not ended with bullets. I wanted Kirk to be safe for his sake, but I also wanted him to be safe for mine and for the sake of our larger shared project… We are all safe, or none of us are.
