Big Brother Algorithm: Newspeak in the Digital Age
“It’s a beautiful thing, the destruction of words.” – George Orwell, 1984.
One of the twentieth century’s biggest warnings against totalitarianism was George Orwell’s magnum opus, Nineteen Eighty-Four. In the year 1984, the world is divided into three superstates after years of conflict, famine, and revolution. The superstate of Oceania rules what was once Great Britain, and the protagonist, Winston Smith, decides to rebel against the Party and its ever-present symbol, Big Brother. Spoiler warning, but it doesn’t end well.
If you’ve read the book, you know how much its language has shaped the way we describe government overreach in modern society. Words like “Big Brother,” “thought police,” “thoughtcrime,” and “doublethink” all highlight the intentional limitation of free thought, the deterioration of language, and the perversion of meaning.
Orwell coined the term “Newspeak” for Oceania’s official language, a vocabulary deliberately stripped down until rebellious thought becomes nearly impossible to express. The term is still used to describe what people see as an intentional perversion of language. Orwell was right. But there is one aspect of modern society that Orwell could not have predicted: community guidelines.
In a world that is increasingly online, tech companies like Meta and ByteDance have begun restricting the words users are allowed to say. On TikTok, words like “suicide” and “killed” become “unalived,” while “eating disorder,” “sexual assault,” and “self-harm” are reduced to acronyms like “ED,” “SA,” and “SH.” Videos that break community guidelines by mentioning “suicide” are flagged and quickly taken down.
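To make the mechanic concrete: TikTok’s actual moderation system isn’t public, but a plausible first line of defense is a simple keyword blocklist. The sketch below is purely illustrative, with a hypothetical word list and function name of my own invention, and it shows why a swapped-in euphemism sails right past this kind of filter.

```python
# A minimal sketch of keyword-based moderation, assuming a simple
# blocklist approach. The word list and function are hypothetical;
# real platform systems are not public and are surely more sophisticated.

BLOCKLIST = {"suicide", "killed", "eating disorder", "sexual assault", "self-harm"}

def is_flagged(caption: str) -> bool:
    """Flag a caption if it contains any blocklisted phrase."""
    text = caption.lower()
    return any(phrase in text for phrase in BLOCKLIST)

print(is_flagged("she killed herself"))    # True  -> video taken down
print(is_flagged("she unalived herself"))  # False -> euphemism slips through
print(is_flagged("struggling with SH"))    # False -> acronym slips through
```

The incentive is the point: the filter punishes the plain word and rewards the euphemism, and the euphemism is what sticks.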
So what makes this bad? While it might not seem like an intentional distortion of the English language, it can shape how the language is spoken in the future. In the examples above, a word like “suicide” becomes completely sterile and devoid of emotion once it is transformed to slip past the algorithm.
Try it yourself. Think of what emotions come to mind when you hear “she killed herself” versus when you hear “she unalived herself.” The harsh emotions I hear in the first are contrasted with little more than apathy in the second. One reads like a tragedy, the other like text. One of my all-time favorite comedians, George Carlin, has an excellent bit on the topic of euphemistic language and emotion. I’ll summarize it below, but watch the actual video if you want to laugh.
Carlin argues that euphemisms and euphemistic language “conceal reality,” and that they are just ways for us to avoid facing it. He talks about how it gets worse with each generation, citing how, over the course of seventy years, shell shock became battle fatigue became operational exhaustion became post-traumatic stress disorder. “I’ll betchya,” he says, “if we’da still been calling it ‘shell shock,’ some of those Vietnam veterans mighta gotten the attention they needed at the time.” Funnily enough, it’s changed once again since then, to PTSD. Completely devoid not only of emotion, but of meaning. We’re now left with just a collection of four letters.
Granted, languages constantly change and evolve over time. If you go back far enough, English starts to sound like gibberish. Words become archaic, lose their meaning entirely, and new words take their place. I have a hard enough time understanding Shakespeare myself. But at the rate technology is advancing, I think our language will deteriorate just as quickly.
Since online spaces don’t seem to be going away anytime soon, words that were once reserved for TikTok and Instagram videos will slowly make their way into our day-to-day speech. I think we should be aware that any effort we make to avoid the censorship algorithm might impact the way we speak, and think, more than we expect.