Post-Marxist Critics’ Ongoing Crusade Against It’s a Wonderful Life

Depending on who you ask, It’s a Wonderful Life is either “a fanfare for the common man” or “one of the most profoundly pessimistic tales of human existence ever to achieve a lasting popularity.”[1] The movie emerged from relative obscurity after its initial box office failure in 1946 to become an American icon by the 1980s, and it remains popular to the present, ranking 20th in the American Film Institute’s most recent industry poll of the 100 greatest American films of all time.[2] It’s even “Britain’s favourite Christmas film,” according to the BBC.[3] Though it might be past the peak of its 1970s-90s popularity, It’s a Wonderful Life remains important to a lot of people and is still the subject of an unusual amount of critical debate and re-evaluation.

“It’s a Wonderful Life is a dark, disturbing fable about greed, exploitation, misery, and disappointment,” said New York Times chief film critic A.O. Scott in 2008.[4] Like many other critics since the film gained prominence, Scott praises the film and recommends it for reasons opposite to what Capra and Stewart intended, asserting that It’s a Wonderful Life is actually a statement about how bad things are for those who have the intellectual ability and inclination to see it. Because It’s a Wonderful Life engages views about the value of life and its relationship to society, its interpretation can act as a reflection of ideology. Critics, who by virtue of their position must assert their tentative status as cultural and intellectual elites, must show in their interpretations that their own ideology reflects the modern elite worldview.

While it has its roots in Marxist cultural criticism, this worldview is less prescriptive than descriptive. Intellectual ideological criticism of It’s a Wonderful Life demonstrates the use of cynical critical perspectives as a shibboleth – a token that marks the critic’s place in this ideological trend among fellow intellectuals.

Frank Capra, the movie’s producer and director, didn’t intend It’s a Wonderful Life to be as ambiguous as it’s sometimes treated in interpretation: “I wanted it to reflect the compelling words of Fra Giovanni of nearly five centuries ago: ‘The gloom of the world is but a shadow. Behind it, yet within reach, is joy. There is a radiance and glory in the darkness, could we but see, and to see we have only to look. I beseech you to look!’”[5] James Stewart, the star of the film whose persona is instrumental to the story, agreed: “The story of It’s a Wonderful Life was conceived by Frank Capra from a very simple phrase: no man is born to be a failure… Frank took the idea and worked with Hackett and Goodrich on the script, and created the story which is my favorite [movie].”[6]

Capra and Stewart made It’s a Wonderful Life as their first project after their respective service in WWII. The film failed to catch on when it was released in December of 1946, losing money and fading into obscurity within a few years. The traditional explanation for the film’s failure is that it “didn’t match the post-Second World War mood,” but it’s difficult to determine why a film succeeds or fails.[7] Its situation was not unlike that of another legendary film from a few years before, Citizen Kane. Both films had vague titles no one had heard before and marketing materials that failed to explain the movie. Consequently, they had similar life cycles: despite warm reviews and a handful of Oscar nominations, “by the end of the year it had closed everywhere, not to be seen widely again in America till RKO sold its library to television.”[8] The growth in popularity It’s a Wonderful Life experienced beginning in the 1960s is often attributed to the widespread showing of the film on television around Christmas after a clerical error allowed its copyright to lapse, leaving the film in the public domain from 1974 to 1993, though this narrative may put too much weight on one factor.[9]

“The week brings its expected cargo of holiday treasures, including two showings of Frank Capra’s It’s a Wonderful Life, which may be the best Christmas picture ever made – although it plays just as well in July as December,” wrote Baltimore Sun film critic Stephen Hunter in a Christmas TV review in 1982. “The angel shows [George] the world as it would have been without him: and it’s a dark and cold and terrifying place, loveless and forlorn. This sequence is one of the great mise en scenes in American movies, and the movie is one of those rare films that aspires to inspire and succeeds.”[10] That same year, the American Film Institute gave Capra its tenth annual Lifetime Achievement Award. Stewart and Capra were getting stacks of mail praising It’s a Wonderful Life.

In the critical and academic world, however, things were changing. “Capra’s endings are, outwardly, optimistic,” wrote Glenn Phelps in the Journal of American Studies in 1979. “Capra’s eye discloses a world in which… the forces of privilege, money, and political power have all of the tangible resources with which to keep the heroic individual in line. Capra draws his portrait of American society too well.”[11] Critics were increasingly following an ideological approach to film criticism which, in the words of film/media and gender/sexuality studies scholar Maria Pramaggiore, holds that “…films that validate the American Dream discourage any analysis of the forces that work against class mobility…” and which “treats the American Dream as a myth that disregards the limitations of a society that values competition more than communal responsibility.”[12]

But ideological critics have trouble rejecting It’s a Wonderful Life outright – the craftsmanship of the film makes that difficult for any devotee of the cinema. Even the infamous contrarian Pauline Kael of the New Yorker grouchily conceded that “In its own slurpy, bittersweet way, the picture is well done.”[13] It’s a Wonderful Life must therefore be interpreted as a work that does not “validate the American Dream” if it is to be praised by these ideological critics. Their project is aided by the fact that movies are especially susceptible to ambiguity.

Movies include countless elements – such as actors, props, scenery, voices, music, and optical effects – often arranged in ambiguous ways. Any aspect of a given scene could be important, could be unimportant except to make the scene appear realistic, or could be relevant to an undercurrent of subtext to the scene or story. Directors often help the viewer out with techniques like a shallow depth-of-field, which focuses the camera and our attention on just one aspect of the scene. Critical moments of It’s a Wonderful Life, however, are shot in deep focus. With the wider depth-of-field deep focus photography provides, almost everything in the shot is in focus, which, according to Hoi Lun Law at the University of Bristol, “requires us to work out what is significant.”[14]

The exultant final scene of the movie serves as an example of this. We can ask of the scene: what is it that makes George and Mary so happy, the money or the love of family and friends? Why does this scene say It’s a Wonderful Life? The camera doesn’t tell us. Selective focus, tighter shots, or more active editing could tell the audience what about the scene is most important, but Capra shows the scene from over George’s shoulder in a wide shot with both the people he loves and the growing pile of money in focus. While a more cynical critic might say the money is the important aspect of the scene, the casual viewer responds to the friends and family who gather and celebrate together.[15]

Since non-academic viewers understand a work in emotional terms, their mood is the variable that best explains differences in their reception of an ambiguous work. They generally choose to focus on the elements of the film that elicit the strongest emotional connections. This mode of interpretation is behind the response to the film that fueled its rise in popular culture.

The traditional divide between high/elite and low/popular culture is gone. As Alex Ross at the New Yorker pointed out, “Opera, dance, poetry, and the literary novel are still called ‘elitist,’ despite the fact that the world’s real power has little use for them. The old hierarchy of high and low has become a sham: pop is the ruling party.”[16] Popular audiences and the elites now view much of the same art and culture. The status of “cultural elite” is expressed not by viewing more sophisticated art, but by interpreting the same art in more sophisticated ways – which in practice means ways more reflective of elite ideology. The old hierarchy of cultural consumption has been displaced by a hierarchy of cultural perception. Critical writing by academics and journalists must then express the elite status of the writer’s perception: an ideological framework indicative of advanced education and a fashionable, elite worldview.

The pre-eminent framework of cynical intellectual ideology has its roots in the period around WWII, when progressive optimism among the elite declined with the death of Hegelianism and the discrediting of traditional Marxism. Hegel’s philosophy of history, dominant in the late 19th and early 20th centuries, holds that “world history… shows the development of consciousness on the part of spirit,” a progressive process toward a spiritual utopia.[17] Traditional Marxism’s view of history is based on the Hegelian one, but it holds that it is not abstract concepts like spirit that progress step-by-step toward utopia but economics. Each step is a revolution, and Capitalism is just an unpleasant step on the inevitable path to the Communism at the end of history. Cultural and intellectual developments are just side effects of that progression.

But in the decade after WWII it became apparent to many that the world was more complicated than anything Hegel’s model of dialectical progression could describe, and Marxist progressive optimism was replaced by post-Marxist cynicism. The “War to End All Wars” had turned out to be just a prelude to more chaos, and the Soviet Union had failed to bring about a Communist utopia. Intellectuals had to reckon with the possibility that human “progress” only amounted to more refined forms of cruelty and destruction. The influential German-American political philosopher Leo Strauss noted that “It has been said, not without reason, that Hegel’s rule over Germany came to an end only on the day Hitler came to power.”[18]

One of the first in the new thread of cynical neo/post-Marxists was Theodor Adorno, a German philosopher and critic, along with his compatriots who formed the “Frankfurt School of Marxism” at Columbia University in New York after fleeing the Nazi regime. The Frankfurt School argued that culture was part of the base structure of Capitalism, rather than merely a dependent part of the superstructure, as the “culture industry” is used by those in power to maintain control. A revolution in culture would have to accompany any economic revolution.[19]

Adorno’s most famous essay, “The Culture Industry: Enlightenment as Mass Deception,” reads like a cryptically worded conspiracy blog, interpreting all aspects of modern culture in the West as tools for brainwashing the proletariat. His work leaves a deep impression of cynicism about modern life bordering on despair:

“The breaking down of individual resistance is the condition of life in this society. Donald Duck in the cartoons and the unfortunate in real life get their thrashing so that the audience can learn to take their own punishment. The enjoyment of violence suffered by the movie character turns into violence against the spectator, and distraction into exertion.”[20]

An Adornoian approach to critical film interpretation would then focus on identifying how elements of the film serve as means of pacification, how the film “commodifies” culture and society, and how it replaces art with a commodity that is tantalizing but never true and satisfying. The work of the French philosopher Michel Foucault a few decades later added a unifying aspect to Adorno’s and other neo/post-Marxist approaches to cultural criticism.

For acolytes of Foucault, every aspect of life is to be understood as a node in a nexus of “power-knowledge,” serving to perpetuate structures of power while enslaved to structures of power at the same time. “Power is not something that is acquired, seized, or shared, something that one holds on to or allows to slip away; power is exercised from innumerable points, in the interplay of nonegalitarian and mobile relations.”[21] Foucault gives critical interpretation the task of identifying all nodes of culture as expressions and effects of “relations of power.”

In the late 20th and early 21st centuries the influence of these relatively obscure thinkers on cultural analysis, having filtered through the levels of the academy to critics and journalists, has been tremendous. Alex Ross at the New Yorker recently observed of his compatriots:

“Anyone who underwent a liberal-arts education in recent decades probably encountered the thorny theorists associated with the Institute for Social Research, better known as the Frankfurt School. Their minatory titles, filled with dark talk of “Negative Dialectics” and “One-Dimensional Man,” were once proudly displayed on college-dorm shelves as markers of seriousness…”[22]

As Michel Foucault remains the most cited author in the humanities, his ideas have a profound influence even on those who don’t directly study them.[23] For example, the influence of Foucauldian analysis can be seen in a 2012 essay in which W. Andrew Ewell at Salon argued that an “economy of goodwill” is the mode through which power exerts itself in the world of It’s a Wonderful Life. “Suicide, George Bailey’s most willful attempt to leave Bedford Falls, leaves him even further indebted to the people trapping him there. Clarence shows George what life would be like if he’d never been born, and what George sees is not how much Bedford Falls owes him, but how much he owes Bedford Falls… Virtue in this scenario is not only not its own reward, it’s its own cost — a commodity that is both finite and fungible.”[24] Virtue is an exertion of power in the Foucauldian sense, but it is also “commodified” in the Adornoian sense. One of the Frankfurt School’s most enduring critiques of Capitalism and American society is that they “commodified” everything, even intangible things like knowledge and culture. This is the cynical mode of criticism Ewell emulates when he identifies Clarence helping George to win his wings as an expression of this economy of goodwill:

“When the time finally comes for someone to do for George as he’s done for everyone else, it’s an act motivated more by personal ambition than by goodwill. The angel Clarence — who “hasn’t got his wings yet” — is sent to Earth to save George from killing himself. But as naïve and kindhearted as Clarence may be, he’s also ambitious. His trip to Bedford Falls isn’t motivated merely by charity, but by promotion: “If I should accomplish this mission,” Clarence asks the senior angel, “might I perhaps win my wings? I’ve been waiting over two hundred years now, sir, and people are beginning to talk.” Read: What do I get if I save this guy?

“Indeed the pursuit of his “wings” becomes Clarence’s, and eventually one of the film’s, major preoccupations. At the end of the film, George’s daughter Zuzu famously pronounces, “Teacher says, every time a bell rings, an angel gets his wings.” By which we’re reminded that Clarence’s wings, like George’s charity, are a commodity, able to be exchanged (and existent only insomuch as they can be traded) for goods and services.”[25]

Critiques like this go far beyond the usual problems of interpretation. They take what most viewers would clearly see as a joke intended to show Clarence’s inexperience as an angel and turn it into a node in a broad interpretive framework they impose on the work. Once this metanarrative framework is imposed on the movie it’s easy to find other aspects of the film that can be interpreted as part of this “economy of goodwill.”

It’s also because they read the film in Adornoian terms that many critics believe non-critic viewers don’t notice that the film has dark moments. “How can a movie so full of pain and frustration be venerated as simply, glowingly jolly?” asked Fernando Croce at Slant. “Maybe it takes a filmmaker so fascinated with the American Dream to see how close it can be to a nightmare.”[26] Wendell Jamieson at The New York Times agreed: “Lots of people love this movie of course. But I’m convinced it’s for the wrong reasons. Because to me ‘It’s a Wonderful Life’ is anything but a cheery holiday tale… ‘It’s a Wonderful Life’ is a terrifying, asphyxiating story about growing up and relinquishing your dreams, of seeing your father driven to the grave before his time, of living among bitter, small-minded people.”[27]

More common than these in-depth restructurings of the worldview of the film are the cynical jabs taken at various aspects of the story, or even its title. Essayist Dan Rodricks at The Baltimore Sun, for example, titled a 1990 article “It’s a Wonderful Life?” and refuted the title of the film with a long list of problems and inconveniences of modern life, like spilling M&Ms or someone taking his shopping cart.[28] “‘It’s a Wonderful Life,’ All Right – Until You Know the Rest of the Story,” wrote critic Bill Granger in the Chicago Tribune in 1988. The “rest of the story” was that Bedford Falls must have declined and gentrified since 1945.[29] Gary Kamiya at Salon took possibly the most cynical perspective, inverting the story by asserting of the alternate world George Bailey sees, the one in which he was never born, that “Potter has triumphed, and we are intended to shudder in horror at the sinful city he has spawned… There’s just one problem: Pottersville rocks!”[30] This, too, is less a critical analysis and more a long stretch for an obviously cynical jab, but it expresses the ideological sophistication and bona fides of the author’s perspective on a popular work.

Pop is indeed the ruling party, as Alex Ross asserted, and since everyone is a potential consumer of popular entertainments – including widely venerated ones like It’s a Wonderful Life – interpretation and the ideology seen to influence it are the criteria for segregating high and low culture. If there is an “authentic” way to consume culture, it is that of the popular audience – those who direct their focus based on their own preferences and emotions, those for whom interpretation is a matter of individual choice. Focus is the key to interpretation, just as it is for the fictional George Bailey, whose change in focus changes how he interprets his own life. It’s only a wonderful life for those who choose to see it.


[1] Stephen Hunter, “At the Movies: ‘It’s a Wonderful Life’ Tops Films of Good Cheer,” The Baltimore Sun, 19 December 1982; Andrew Sarris, You Ain’t Heard Nothin’ Yet: The American Talking Film, History and Memory, 1927-1949 (Oxford University Press, 1998), 356; quoted in Marc Eliot, Jimmy Stewart: A Biography (New York: Harmony Books, 2006), 427.

[2] American Film Institute, “AFI’S 100 Years…100 Movies – 10th Anniversary Edition,” American Film Institute, 2007, Accessed 17 November 2021, https://www.afi.com/afis-100-years-100-movies-10th-anniversary-edition/.

[3] BBC News, “Why It’s a Wonderful Life is the Nation’s Favourite,” BBC, 20 December 2018, Accessed 20 November 2021, https://www.bbc.com/news/newsbeat-46618522.

[4] A. O. Scott, “It’s a Wonderful Life – Critics’ Picks,” The New York Times, 9 December 2008, YouTube video, 4:19, https://www.youtube.com/watch?v=XrQFessHE2o.

[5] Jeanine Basinger and Leonard Maltin, The It’s a Wonderful Life Book (New York: Alfred A. Knopf, 1979), ix.

[6] James Stewart to Lonnie Schlein, 18 March 1982, James Stewart Papers, Box 26, Folder 4, L. Tom Perry Special Collections, Harold B. Lee Library.

[7] Pauline Kael, 5001 Nights at the Movies – A Guide From A to Z (New York: Holt, Rinehart, and Winston, 1984), 374.

[8] Simon Callow, Orson Welles: The Road to Xanadu (New York: Penguin Books, 1996), 575; The legend that It’s a Wonderful Life did not receive good reviews is “simply not true” according to Jeanine Basinger’s survey of the sources. See Basinger, The It’s a Wonderful Life Book, 54.

[9] It’s a Wonderful Life’s public domain status probably didn’t hurt the spread of the movie, but the revival was already ongoing at the time it fell into the public domain. The Google Books Ngram Viewer gives a rough indicator of its popularity by counting the number of times its title is mentioned in the books and articles Google has indexed, as a percentage of all words or phrases in those materials. Because increases in writing about a film follow increased viewing of a film, the numbers are a rough lagging indicator. But we can see from the data Google does have that the movie was already being written about more in 1978 than in 1947, just after its release, though the uptick in interest had begun about a decade earlier. See Google Books Ngram Viewer, search for It’s a Wonderful Life (Google), accessed 19 November 2021, https://books.google.com/ngrams/graph?content=It%27s+a+Wonderful+Life&year_start=1940&year_end=2019&corpus=26&smoothing=3#.

[10] Hunter.

[11] Glenn Alan Phelps, “The ‘Populist’ Films of Frank Capra,” Journal of American Studies 13, no. 3 (December 1979): 381.

[12] Maria Pramaggiore and Tom Wallis, Film: A Critical Introduction, 3rd ed. (Boston: Allyn & Bacon, 2011), 310.

[13] Kael.

[14] Hoi Lun Law, Ambiguity and Film Criticism (Cham, CH: Springer International Publishing, 2021), 5.

[15] See Manny Farber, “Mugging Main Street – A Review of ‘It’s a Wonderful Life,’” The New Republic, 6 January 1947, accessed 6 December 2021, https://newrepublic.com/article/98662/mugging-main-street-review-its-a-wonderful-life. In possibly the only period negative review of the film, Farber asserts that the money is the emotionally operative aspect of this scene: “Admirers of this honest, hard-working, self-sacrificing character Stewart has played since he left Princeton are going to be uneasy when they see Jimmy’s face light up like a Christmas tree at getting all this free dough.”

[16] Alex Ross, “The Naysayers,” The New Yorker, 15 September 2014, Accessed 16 November 2021, https://www.newyorker.com/magazine/2014/09/15/naysayers.

[17] G.W.F. Hegel, Lectures on the Philosophy of History, trans. Ruben Alvarado (Aalten, NL: Wordbridge, 2011), 58.

[18] Leo Strauss, Spinoza’s Critique of Religion, trans. E. M. Sinclair (The University of Chicago Press, 1965), 2.

[19] Andrew Fagan, “Theodor Adorno,” Internet Encyclopedia of Philosophy, Accessed 16 November 2021, https://iep.utm.edu/adorno/.

[20] Theodor W. Adorno, “The Culture Industry: Enlightenment as Mass Deception” in Dialectic of Enlightenment, trans. John Cumming (1947; New York: Herder and Herder, 1972), 138-139.

[21] Michel Foucault, The History of Sexuality, vol. 1, trans. Robert Hurley (1976; New York: Vintage Books, 1990), 94.

[22] Ross.

[23] “Most cited authors of books in the humanities, 2007,” Times Higher Education, 26 March 2007, Accessed 20 November 2021, https://www.timeshighereducation.com/news/most-cited-authors-of-books-in-the-humanities-2007/405956.article?storyCode=405956.

[24] W. Andrew Ewell, “‘It’s a Wonderful Life’: Occupy Bedford Falls!,” Salon, 8 December 2012, Accessed 20 November 2021, https://www.salon.com/2012/12/08/its_a_wonderful_life_occupy_bedford_falls/.

[25] Ibid.

[26] Fernando F. Croce, “Review: It’s a Wonderful Life,” Slant, 16 November 2007, Accessed 20 November 2021, https://www.slantmagazine.com/film/its-a-wonderful-life/.

[27] Wendell Jamieson, “Wonderful? Sorry, George, It’s a Pitiful, Dreadful Life,” The New York Times, 18 December 2008, Accessed 20 November 2021, https://www.nytimes.com/2008/12/19/movies/19wond.html.

[28] Dan Rodricks, “It’s a Wonderful Life?”, The Baltimore Sun, 8 January 1990.

[29] Bill Granger, “‘It’s a Wonderful Life,’ All Right – Until You Know the Rest of the Story,” Chicago Tribune, 24 January 1988.

[30] Gary Kamiya, “All Hail Pottersville!”, Salon, 22 December 2001, Accessed 20 November 2021, https://www.salon.com/2001/12/22/pottersville/.

“Infrastructure” Bill: Can We Trust the Government With Anything?

While the media are occupied with fake news on the border and Biden and Psaki are keeping the public busy with vaccine mandates, Bernie Sanders and his comrades are working to get the 50th vote out of Joe Manchin for their $3.5 trillion supplement to the already $1 trillion “infrastructure” package.

Just as the infrastructure bill also became a bill to tax, regulate, and spy on cryptocurrency activities, the budget reconciliation bill became a jobs bill and then progressed into a beta version of the Green New Deal.

In the U.S. we theoretically have a republic, a system in which the scope of laws and the extent of authority are limited by textual contractual agreement. But what good is any contract in a world where “infrastructure” means the creation of a “Civilian Climate Corps” and “nor shall any State deprive any person of life, liberty, or property, without due process of law” means “free State-sponsored infanticide”?

Many Americans, even ostensibly constitutionalist conservatives, are okay with the federal government maintaining state roads and bridges. If constitutionality ever comes up, they’ve long been happy to accept the interpretation of the power “To regulate Commerce with foreign Nations, and among the several States” as “infrastructure.” But what’s the point of enumerating powers if they are interpreted to mean whatever is politically convenient? What’s the point of writing laws if administrative agencies can interpret them to mean whatever the bureaucracy wants them to?

It’s not as if ambiguity and hermeneutical conflict are unique products of post-modernism that weren’t issues before the twentieth century. The fundamental conflict for Socrates was with the ancient Greek Sophists, who distorted the relationship between meaning, understanding, and reality through verbal trickery. The practical Sophists were, in effect, lawyers who specialized in helping those involved in legal disputes to reinterpret laws and contracts to mean whatever they wanted them to, even the opposite of their original intent.

But conflicts over interpretation don’t necessarily even involve bad-faith trickery, and meaning can be twisted or even inverted by perspective. Whether the simple phrase “This [bread] is my body which is given for you: this do in remembrance of me” (Luke 22:19) means that the Holy Communion is literally human flesh has been the subject of exegetical dispute for almost 2000 years, including one argument between Luther and Zwingli that allegedly left the latter man in tears.

The concept of the impartial judge who weighs the best arguments from both sides is an ancient but imperfect solution to this problem. In common-law courts, questions regarding the facts are decisions for the jury, while questions regarding the law are left to the judges. But when the government is a party to disputes regarding its own authority, interpretation is entrusted to judges the government itself appointed, chosen from those who are trained in and show the highest levels of mastery of the technical application of sophistry.

Ultimately, the government cannot be trusted to uphold the deals we make with it. The government, at best, streamlines the exercise of certain rights, but it can never be counted on to protect those rights. In being charged with protecting those rights, the state is also given a claim to interpret them, meaning that all rights are understood as coming from, or at least flowing through, government rather than from God or the divinity within humanity.

As the Wall Street Journal Editorial Board has documented, the $3.5 trillion cost estimate is far below the actual costs of the bill. But even that might not be enough for the far left in the House, possibly derailing the whole scheme. So our best defense against the government might not be the law or the Constitution, but the incompetence and radical infighting that always seem to accompany it. That defense, however, is temporary at best.


‘1820’ Could be the Best Latter-Day Saint Musical Since ‘Saturday’s Warrior’

But for now, the search for the great Latter-Day Saint musical continues.

The closest anyone has come is Lex de Azevedo and Doug Stewart’s ‘Saturday’s Warrior,’ which bounces from the sublime to the sentimental to the painfully cheesy and back again. Despite containing nuggets of musical and thematic triumph, it makes for a lopsided package with a pointlessly tangled set of narrative threads. De Azevedo’s other successful Latter-Day Saint musical, ‘My Turn on Earth,’ is cut from the same uneven cloth, juxtaposing lines like “the most precious gift we have been given next to life itself is the power to direct that life” with lines like “John, you’re not really Satan, we’re just pretending!”

BYU Professor of Theatre George D. Nelson’s new ‘1820: The Musical’ is more consistent and less likely to make the audience cringe, but like ‘Saturday’s Warrior’ and ‘My Turn on Earth,’ it’s a musical with great songs and moments that nevertheless fails to be a great musical. The songs, written by a team including Kendra Lowe Holt, Kayliann Lowe Juarez, and Doug Lowe, depict various vignettes from the lives of the prophet Joseph Smith and Emma Smith and often succeed in capturing the emotions of moments in the story. Nelson’s book, however, is unable to make a plot out of these moments (despite the book having been written first, according to Nelson).

In our Reactionary age, it seems that all criticism is interpreted as hate. ‘1820’ could be a great musical, but the problems that will hold it back from wider success are downstream from the key issue: it has no plot. To say that ‘1820’ has no plot is not the same as saying that I hate the musical. Lots of things happen in ‘1820,’ some of them in good scenes, but they aren’t connected by cause and effect. ‘1820’ is a series of scenes about the prophet’s life, ordered only by chronology, with no sense of storytelling or thematic purpose.

This lack of purpose is what makes some choices in the production itself so baffling. For example, the choreography by BYU contemporary dance instructor Adam Dyer features elaborate group numbers on the “showstopper” songs, but for most of the production has the extras doing a lot of dopey crawling around on the floor and creeping around the set in faux slow motion before pushing and pulling the principals around for no discernible reason. Outside the carefully rehearsed big dance numbers, the dancing extras convey amateurishness in their very movements.

‘1820’ is compared to ‘Hamilton’ in its marketing materials, which explains the otherwise bizarre use of a minimalist set alongside more elaborate lighting effects. ‘1820’ also imitates ‘Hamilton’ in its use of race-reversed casting, and Conlon Bonner delivers an emotionally compelling performance as Hyrum Smith. Musically, I won’t hold the comparison to ‘Hamilton’ against ‘1820,’ which fortunately includes only one big rap sequence, a vaguely 80s-style number that could be mistaken for the work of Will Smith, or maybe MC Skat Kat.

The highlight of the cast is Zach Wilson in the role of Joseph Smith, a triple threat who commands each scene dramatically, has the best male singing voice on that stage, and dances with rhythm and dexterity across multiple genres and styles. The songs are generally similar to those in ‘The Greatest Showman,’ catchy pop tunes with simple lyrics that summarize the basic emotional beats of their scene. Individually, many of them succeed quite well.

‘1820’ just completed a six-week opening engagement at the Covey Center in Provo, launched with a blitz of advertising and attempts to gain a following through Latter-Day Saint “influencers.” The producers say they have an eye on bringing the production all the way to Broadway, but in its current state, it’s not ready for New York. Catchy songs alone won’t make it anywhere without a really big name attached.

‘Hadestown,’ the 2019 Tony winner for Best Musical, took a fifteen-year trip through workshops, previews, local productions, and even a concept album before becoming Broadway’s big hit of 2019. If ‘1820’ is to continue along a similar path, it needs serious improvements before we see a better version on a regional tour. Nelson has suggested that ‘1820’ was written in part as a response to ‘The Book of Mormon.’ I would suggest, then, that in future performances, when the cast sings the closing number “I’m Still Here,” they hold up copies of the Book of Mormon to accompany their defiance of those who condemn and ridicule the prophet and the faith, both past and present.

‘1820: The Musical’ Official Soundtrack Album Cover (Amazon)

The soundtrack recording, released in advance of the stage debut of the musical, stands on its own and could become a niche favorite independently of the stage production. It’s available on Spotify, YouTube, and other streaming services. Start with “Who is This Man?,” “Alive in Christ,” “All About Timing,” and “I’m Still Here.”

For Mayor de Blasio, CRT Really is Just a Tool

Last week, Mayor Bill de Blasio of New York City announced the first wide-scale vaccine passport system in the U.S., the so-called “Excelsior Pass.” The few objections from the left to this scheme are not concerned with the inherent impracticality and immorality of regulating personal behavior, but with the disproportionate impact the requirement will have on certain minorities. In the Critical Race Theory framework, that makes it inherently White Supremacist.

A few grifters have come out of the woodwork to call vaccine passports racist, most notably the mayor of Boston. But Twitter and the media have generally decided that it isn’t really worth hearing about this time. Apparently, racism is no longer a bigger “public health crisis” than Covid, unlike a year ago when Covid deaths were much higher. The most visible manifestation of these arguments is actually a parody/hoax that circulates under #askmewhy, courtesy of the 4Chan trolls. The posts are ridiculous, but at the same time they are a pretty convincing facsimile of actual arguments used by BLM and similar groups about other topics.

But the case that vaccine passports are White Supremacist is indisputable if you understand the world through the Critical Theory framework. 39% of Black New Yorkers are vaccinated versus 53% of Whites, a ratio that is similar to the national average, though individual states differ widely. This difference means that a policy requiring proof of vaccination from everyone is systemically racist. Pointing out that vaccination status is an objective, racially-neutral standard is “the myth of color blindness.”

The parts of critical legal theory and critical race theory that focus on law and policy are founded on the quest to show that seemingly neutral laws and universally beneficial norms are actually systems of racism. Their modus operandi is to find any inequity – any statistical difference between races other than average melanin – and connect it to a law/policy/norm that has a “disproportionate impact.” By dint of having a disproportionate impact, that law/policy/norm is assumed to be rooted in White Supremacy.

For example, if a bar in New York checks everyone’s proof of vaccination along with proof of age at the front door, fewer Black and Latino patrons would be allowed entry because fewer of them are vaccinated. Requiring proof of vaccination has a “disproportionate impact” – those allowed entry would include a disproportionate share of Whites and Asians – and is therefore White Supremacist. It’s just like making the bar “whites only,” except somehow, the structures of systemic racism erected to perpetuate White Supremacy find a way to favor Asians, 82% of whom are vaccinated.

These are the logical implications of the Critical Race Theory that Democratic politicians found so useful for hitting their opponents with back in 2020 to rally their base and scare independents. If it wasn’t obvious six months ago, it should be obvious now that the politicians who claimed that requiring ID to vote was racist didn’t really mean it. Some of them are even claiming that Fox News made the whole bogeyman up out of thin air. If voter ID is “Jim Crow 2.0,” then vaccine passports are Jim Crow 3.0.

On average, fewer Blacks choose to get vaccinated than Whites, which is the obvious cause of the discrepancy in the impact of vaccine passports. There’s a cultural difference underlying this difference in choices, and it’s far more tangibly linked to the racism of decades and even centuries past than to modern policies.

The legacy of slavery and discrimination in the 100 years after emancipation has led to the disproportionate prominence of what Thomas Sowell identified as “redneck culture” in Black communities. Southern and transplanted inner-city Blacks inherited culture and language from the Scotch-Irish immigrants who constituted most of the poor Whites they lived alongside in the South. “Black ghetto” culture, according to Sowell, is rooted in this dysfunctional “redneck culture” rather than embedded in the skin. (See Thomas Sowell, Black Rednecks and White Liberals, for a look at the cultural influences of “redneck culture,” and J. D. Vance, Hillbilly Elegy, for a look at this culture in modern poor White communities.)

Looking at vaccination numbers, the prevalence of redneck culture is a stronger predictor of vaccination than race. New York Blacks have similar vaccination numbers (39%) to Mississippi Whites (38%) and Mississippi Blacks (39%). Culture, not race or racism, or even government policy, can explain most if not all of the discrepancy in vaccination numbers.

But critical theory adherents like Boston Mayor Kim Janey prefer to attribute all discrepancies to racism. Janey compared vaccine passports to slavery-era freedom papers and birtherism. “There’s a long history in this country of people needing to show their papers,” she said, referring to New York’s policy.

Mayor de Blasio, who just five months ago announced a Racial Justice Commission to dissect New York City’s charter in search of structural and institutional racism, now says that for Janey and other activists to accuse his policies – policies that very clearly have a disproportionate impact – of perpetuating structural racism is “absolutely inappropriate.”

Politicians like de Blasio have scoffed at the Critical Race Theory activists they “stood alongside” last year, now that those activists might hinder, rather than help, their attempts to exercise control. It turns out that for those who crave power, CRT really is “just a tool.” Not a tool for academic analysis, but a tool to try to “dismantle systems of oppression” that get in the way of politicians’ consolidation of power, like the Senate filibuster. But critical theory is a dangerous tool that may even have a mind of its own, as some cynical politicians may find out.


Zuckerberg is the First to Stop Clapping

Ken Burns says that Mark Zuckerberg is an enemy of the state. As an actual enemy of the state, I resent that.

In an interview with the New York Times’s Kara Swisher, the legendary documentarian hinted that Facebook’s upper management, including CEO Mark Zuckerberg and COO Sheryl Sandberg, should be treated like former Nazi leaders. “The Nuremberg of this, if it ever happens, which it won’t, will be pretty interesting,” said Burns.

Strangely, some conservatives agreed with Burns’s assertion that Zuckerberg “belongs in jail.” The right hates Zuckerberg because of Facebook’s efforts to combat “misinformation” through censorship and fact-checking. The left hates Zuckerberg because he hasn’t purged all non-leftist thoughts from Facebook yet. But the conservative comments section intelligentsia seems to think that Burns is right simply because he hates Zuckerberg like they do, even if it is for the exact opposite reason. Burns is parroting the standard talking points dictated by the mainstream media/DNC. Biden similarly said about a week before that Facebook is “killing people” by not censoring enough.

There’s a story in Aleksandr Solzhenitsyn’s Gulag Archipelago about a Communist Party meeting in Moscow where, after passing a resolution affirming the local party’s loyalty to Stalin, everyone stood fiercely applauding, clapping and yelling, for over ten minutes, as no one wanted to be seen as the first to stop. Finally, the director of the local paper mill sat down, allowing the applause to end at last. Later that night, the NKVD showed up on his doorstep. “In this way, independent men are known,” writes Solzhenitsyn. (Aleksandr Solzhenitsyn, The Gulag Archipelago, pp. 121-123)

The revolutionary left always cancels its own. The Russian Gulags held very few actual counter-revolutionaries and “enemies of the state.” The NKVD was able to exercise such total control because anyone who betrayed the slightest hint of being only lukewarm toward the cause would disappear soon after.

Despite the rapid and radical shifts in Democratic and popular attitudes toward free speech, despite the bogus fact checks, the algorithm manipulation, and the general hostility from Facebook, the Daily Wire is not only still on the platform but is also the most popular news publisher on Facebook. Zuckerberg has not eliminated all conservatives. That’s why Burns is required to hate Zuckerberg. That’s why Zuckerberg is to be labeled a counter-revolutionary: he has failed this loyalty test.

This reliance on loyalty tests is not just a feature of the left. In any group with a shared ideology there is a tendency for members to want to prove their loyalty, so as not to be suspected of doubt, and for busybodies in the group to look for those signs of doubt and possible dissent. It happens with libertarians and conservatives all the time, and even Alex Jones is suspected by some hard-core Conspiracy Theorists of not being fully committed to the Conspiracy Theory ideology – and therefore of being a disinformation agent – because he doesn’t blame everything on the Jews.

It may be a common or even universal human problem, but it becomes dangerous when these ideological groups have the power to send dissidents to the Gulag or excise them from the culture. Libertarians and conservatives should try to create movements that tolerate doubt, cultures that don’t want to destroy people for imperfection.

Do we destroy those who doubt, or do we help them and lift them up? The teachings of Christ are an example of the latter. As President Dieter F. Uchtdorf said, “One of the purposes of the Church is to nurture and cultivate the seed of faith – even in the sometimes sandy soil of doubt and uncertainty. Faith is to hope for things which are not seen but which are true.” Believing in true and false and right and wrong does not task us with destroying others for their imperfections.

I can’t read Mark Zuckerberg’s mind. I think that while he’s a natural liberal, deep down in his mind there are principles of free speech that he picked up from the 2000s programming culture, principles that Jack Dorsey might also have shared before he joined the hippie commune. But whatever Zuckerberg does at this point, he’s already marked as someone not properly dedicated to the leftist cause. Solzhenitsyn wrote about the fate of the man who sat down at the Communist Party rally: “With extraordinary ease they gave him ten years, claiming a totally different motive. But when he had signed the summary of charges, the investigating judge reminded him: never be the first to stop clapping.”



Top Image: Party officials applaud at the 15th Congress of the All-Union Communist Party in 1927. Joseph Stalin is in the bottom right.

Facebook Censorship Empowers Falsehoods

Facebook has ramped up its efforts to control the flow of information since Biden’s proclamation two weeks ago that it was “killing people” by allowing people to post “misinformation” about Covid-19 and vaccines. Now not even job posts that mention the need for potential employees to be tested or vaccinated can escape the dreaded “Visit the COVID-19 Information Center” banner. It’s gotten to the point where they are no longer just suppressing claims they deem false, but unapproved opinions as well.

Because of this, many people are reaching a point where they consider official disapproval to be a mark of truth. Conspiracy theorists – who have always posted and shared made-up crap under the radar – have held this view for years. I suspect that the modern flat Earth movement came into being when someone was talking to their conspiracy theorist buddy and argued that “Not everything the official sources say is necessarily false. Just because they say the Earth is round, that doesn’t mean you can really believe it’s flat.” To which the conspiracy theorist replied, “hold my beer.”

I doubt Samuel L. Jackson actually said this, but it somehow seems more credible if you imagine him saying it in his deep voice. It might be even more credible with a picture of Morgan Freeman.

If we can accept the possibility that at least on rare occasions official sources might get something right, then we need to evaluate official claims on the strength of their evidence. Ideally, social media could be a great place for that process. Someone posts a claim, someone asks for evidence, the original poster or someone else posts evidence, someone replies with conflicting evidence or points out problems with that evidence, and so on. If we could do this in a civilized manner, we might get closer to the truth. Conversation is necessary for understanding, scrutiny is the source of scientific knowledge, and opposition is the forge of experience.

Scrutiny often dismantles falsehoods, which is why – back in the days when there were lots of conspiracy theory groups on Facebook – they were careful to block any dissenting voices from their pages. Rational, evidence-based discussions are the bane of both conspiracy theories and government propaganda, which is why censorship of rational discussions empowers both. Censorship doesn’t hurt conspiracy theories much: what they lose in Facebook traffic to their pages they more than make up for with the strange new credibility the censorship has given them and the “secret hidden knowledge” they can implicitly or explicitly promise.

This is a great way to advertise a boring video.

Social media censorship often decreases the reach of true stories the mainstream media/DNC disagrees with, like the Hunter Biden emails or the Covid-19 Wuhan lab-leak theory, but not so much that we haven’t heard of them. The most damage is done to open, skeptical discussion and the exchange of intellectually diverse materials and ideas. Conspiracy theories can be marketed as “the video THEY don’t want you to see,” but the truth usually doesn’t get the benefit of that exciting heading.

Facebook censorship empowers not only conspiracy theories, but run-of-the-mill rumors. For example, I recently came across a claim that a conservative boycott of Coca-Cola over its noted support of critical race theory had successfully damaged the company’s business.

This claim could be true, but I was skeptical, if only because conservatives are notoriously lousy at following through on boycotts. The top-down control approach to baloney like this is to hide it, make it blurry to the audience, and/or reduce its reach. But maybe there’s evidence. The rational, decentralized, libertarian approach is to ask for a source, which is what I did.

A traditional way of asking for sources. Comic from xkcd.com

Facebook blocked my comment asking for a source, because social media censorship is designed to prevent all controversial conversations. Any possibility of pointing out the need for a source or trading evidence with the original poster is automatically ruled out.

Of course, social media companies and news organizations are private companies and can legally censor whoever they want, as many libertarians will constantly remind everyone who will listen (though the government’s finger is clearly on the scale at this point). But libertarians should also know better than anyone else that you can’t derive morality from legality. And you can’t derive efficacy or practicality from either.

Rumors, hoaxes, and conspiracy theories about Covid-19 vaccines continue to thrive, despite (or possibly because of) social media’s censorship. But without the censorship, we might have a rational discussion about how necessary the vaccine is for certain populations or about the need for mask mandates, and that’s even more unacceptable. Without censorship, we might be able to discuss the issue, to trade evidence, and maybe to convince each other of the right position. Or we might not, as humans often let their emotions override their reason. But even that would be better than letting falsehoods go unchallenged by having no discussion at all.


The Authoritarian Moment is an (Incomplete) Complete Formulation of Shapiro’s Thesis on the New Left

There’s a problem with books from radio/podcast hosts: they usually contain nothing that the host hasn’t already talked about a dozen times. If you listen to the respective hosts’ shows a few times a month, there’s nothing to set books like Rush Limbaugh’s See, I Told You So, Glenn Beck’s An Inconvenient Book, or Sean Hannity’s Conservative Victory apart from their daily radio shows except for a new framing device. The talking points remain the same, but at least Beck once had a sense of humor.

Books like this can be useful when they outline a complete version of the author/host’s worldview or their diagnosis of the current situation that we can then critique. Ben Shapiro is cleverer and better educated than any other conservative radio host, so he recognizes this need to write a book that defends a thesis about the social/political world in a complete and concise way.

Naturally, this means that The Authoritarian Moment, Shapiro’s newest book, is formed from a selection of the previous year’s worth of talking points from his podcast. The fact that there’s nothing new is the natural consequence of talking nonstop for three hours a day. The Authoritarian Moment shapes Shapiro’s ideas into an overall theory that a series of recent trends perpetuated by the new left constitute an authoritarian push to silence dissent. These trends are obvious to anyone who has observed American culture and politics recently: the crackdown on social media, the acquiescence of corporations to woke demands, the bastardization of science, and the takeover of the academy, among others. Shapiro attributes these trends to a few social and psychological factors, like “renormalization,” ultracrepidarianism, the transformation of openly partisan news into partisan news that claims to give the unbiased truth, and the conjunction of the revolutionary instinct with the utopian instinct.

“Trump might have authoritarian tendencies,” writes Shapiro, “but he did not wield authoritarian power.” There’s a problem of definitions in the book that Shapiro seems to be aware of but is not capable of solving. We generally have an understanding of authoritarianism that involves the use of violence, the threat of violence, or the use of government power – which is an implicit threat of violence. But the old leftist game is to confuse speech with violence, voluntary acts with fascism, everyday influence with authoritarian power.

If we’re going to create a new meaning of authoritarianism, one that includes nonviolent, non-state actions, we need to clearly define the term’s new senses in opposition to each other. What is the difference between authoritarian instincts and authoritarian power? It’s a tricky question, and Shapiro isn’t quite able to give a satisfactory answer. But this matters, because it will take rational arguments within a logically consistent framework to gain back ground in the war of ideas.

Shapiro attributes the takeover of certain institutions, like academia, to “renormalization,” a process in which the loudest and most stubborn in an institution are able to shift the status quo by intimidating those who want to take the path of least resistance into going along with their insane new normal. “The squeaky wheel gets the grease,” as the saying goes.

This might explain why some members of an administration cave to the radicals, but is renormalization really adequate to explain the total purge of the universities? How does renormalization work on notoriously intransigent groups like Burkean conservatives, philosophical Pragmatists, and even classical economists? And why, for the entire twentieth century, were so many intelligent members of the academy intellectually unable to contend with the philosophical equivalent of snake oil-peddling quacks? Shapiro isn’t necessarily wrong here, but he’s trying to use a single theory to explain too much.

The Authoritarian Moment is a good guide/reminder of some of the insanities of the last couple of years, like the cancellations of James Bennet, Gina Carano, Bari Weiss, and the Covington students. Shapiro endeavors to connect these by a common thread. But he avoids going into the weeds to refute some of the core ideas behind this ideology. Critical theory epistemology underlies their “ethical” argument for silencing dissenters, while Karl Popper’s idiotic “Paradox of Tolerance” in various forms underlies their practical argument. At some point, conservatives might have to stop talking about how crazy the people who advocate these ideas are and actually refute the core ideas themselves.

Every book like this has some kind of call-to-action in the short last chapter, suggesting how we might fight back against the evil that constitutes 95% of the book. “They can’t cancel us if we don’t let them” is a good rallying cry, but it brings up a difficult problem. Do we let them cancel neo-Nazis? Would we cancel an anti-Semitic “Black Hebrew Israelite”? Where do we draw the line? Should those who suggest Nazis shouldn’t be canceled be canceled themselves? Should those who suggest that those who suggest that those who suggest that Nazis shouldn’t be canceled shouldn’t be canceled shouldn’t be canceled be canceled?

In his commentary elsewhere about the whimsical mandates of government entities regarding Covid-19 masks and lockdowns, Shapiro often speaks of the need for a “limiting principle.” What is the limiting principle with regard to what speech should get someone canceled? Can we draw the line at advocating violence? If that were the case, we could cancel people for advocating war in the Near East, enforcement of drug or firearms laws, or BLM riots. If the standard for cancellation is only societal norms, then anyone with minority views outside the Overton window should be canceled. The canceled can only complain that society’s norms have changed while looking in from the outside.

Maybe he’s suggesting that we should cancel no one, and be tolerant and friendly toward those who have evil beliefs. But if that’s what Shapiro is advocating, then he needs to actually say it. If not, what consistent principle protects conservatives but cancels actual real-life white supremacists? It’s a question that needs to be reckoned with if there is to be a cohesive resistance against the authoritarian left, and Shapiro leaves this important one unanswered.




Top Image: Children line up in front of a mural in Pyongyang, North Korea. Photo by Thomas Evans

Teachers Unions Object to Teaching Critical Theory, as that Might Require Teaching

Brought to you by the Seriously, Amazingly True Information Reporting Extravaganza

In an unexpected move, representatives from ten of the U.S.’s major teachers’ unions reversed their position regarding the teaching of critical theory in public schools.

“We have been informed by the media that critical theory is just an academic framework used by legal scholars to analyze disproportionate impacts in law,” said a spokesperson for the American Federation of Teachers. “We cannot imagine placing the burden of explaining to students what those big words mean on our already overworked teachers.”

“We will continue to uphold the values of the AFT: that Black Lives Matter, women’s rights are human rights, science is real and racist, no human is illegal except Cubans, love is love, kindness is everything, and semantically overloaded slogans are preferable to rational discussion of complex issues. We will continue to do our best to ensure that all students can chant these values on demand, but our teachers are simply too underpaid to add teaching to this heavy task.”

The U.S. Secretary of Diversity, Equity, and Inclusion criticized the move, saying that it was a “privileged attempt by mostly straight, white, cisgendered teachers to preserve the white supremacy inherent in education,” pointing out that “teaching critical theory doesn’t count as teaching because teaching, by definition, asserts the dominance of white colonialist cisheteropatriarchal narratives which critical theory seeks to dismantle.”

The AFT responded that “we remain committed to dismantling whiteness.” They mentioned that they have a legal fund “ready to go” to expedite the removal of any teacher “who tries to indoctrinate students with white supremacist creeds like the skeptical evaluation of historical narratives based on facts.”

A compromise where students would be assigned books – like Critical Race Theory by Richard Delgado or The History of Sexuality by Michel Foucault – but teachers would not be expected to teach students to skeptically evaluate them was rejected when someone pointed out that the average high school student in the U.S. reads below a sixth-grade level. Antiracist Baby by Ibram X. Kendi has been suggested as a possible alternative.


Top Image: Abandoned school in Pripyat, Ukraine. Photo by Jorge Fernández Salas.


Not S.A.T.I.R.E.: Dive into woke critical theory’s intellectually upside-down way of thinking in our newest essay, “Woke and Woker: The Shared Thought Processes of Conspiracy Theory and Critical Theory”


Woke and Woker: The Shared Thought Processes of Conspiracy Theory and Critical Theory

Wake up, sheeple!

The least productive hobby I’ve ever had was arguing with Conspiracy Theorists in the YouTube comments – particularly the ones claiming that the Jews run the world and might also be reptoid space aliens. For a long time, I couldn’t figure out why I wasn’t making any progress. After all, I could find or come up with what I thought was an extensive, logical, and verifiable response to any of their silly arguments. My responses fell on deaf ears, and as a result I eventually swore off all comment section arguments.

Maybe my writing wasn’t as brilliant as I thought it was at the time, but I had no way of knowing that because they never actually responded to my arguments. Usually they either switched to another similarly silly argument or called me a “shill” for whatever company or organization they said was behind the conspiracy. Sometimes they insisted I was Jewish, even when I told them that I unfortunately did not have that honor.

The common catchphrase among the hard-core Conspiracy Theorists was “wake up, people/sheep/sheeple!” or “open your eyes!” There are invisible structures of power and hierarchy that determine how the world really works; you just have to wake up to see them. That’s why hard-core conspiracy theorists were sometimes called, and even called themselves, “woke,” before the term became better known as a label for the newly ascendant critical theory-based movement of the left. And despite their opposition – a result of their cultural differences – woke critical theory has a lot in common with woke conspiracy theory in its epistemology – how it thinks about knowledge.

There are many theories, accusations, and suggestions that a conspiracy may exist, but a Conspiracy Theory worldview – which I emphasize with a capital C and capital T to set it apart from an ordinary accusation of a crime that involves multiple conspirators – goes beyond that: it interprets every important event or relevant piece of information as constructed by a system of power under an evil conspiratorial group. Any facts or arguments that contradict the theory are interpreted as disinformation from the conspiracy. Anyone arguing against the conspiracy theory is assumed to be in league with the evil group, and therefore any evidence against the theory is actually evidence that the theory of power is correct. A conspiracy theory, in this sense, is unfalsifiable circular logic, and therefore disconnected from truth and reality. It can’t be disproven, but there are countless reasons to doubt it.

The loosely connected group of dangerous ideas that is overtaking the culture and institutions of the U.S. and Europe, sometimes referred to collectively as “critical theory” after part of its academic origins, operates on the same principle. Its advocates are commonly called the “woke” because they’re supposedly awake to how the world really works – to the networks of power that dominate the world and brainwash all the people who are asleep into disagreeing with their theories.

“Privilege,” most commonly in the form of “white privilege,” “male privilege,” and “cisheteronormative privilege,” is the most widely recognized manifestation of this “network of power” that controls the world. But the term is not widely understood. It does not necessarily mean that one is blessed with relative financial prosperity or any other material advantage, which is why a white male hobo is “privileged” over Oprah. “Privilege,” according to critical theory, is about how one way of thinking is privileged, meaning that it controls how both whites and those who have been brainwashed by white colonialism think. There is a white way of thinking that controls the world like the Illuminati.

If you argue against this, that’s evidence of your privilege, and proof of how far we have yet to go to achieve “epistemic justice.” Arguments against white privilege are proof that the arguer is infected by white privilege, and therefore evidence of the existence and dominance of white privilege. Those who believe this are “woke” because they’re supposedly the ones awake to these hidden networks of power. Once again, we have unfalsifiable circular logic.

New Yorker cartoon by Ben Schwartz. Critical theory is sometimes opaque even to mainstream liberals who are expected to know the language.

There are many people, possibly a majority of Americans, who casually accept the worldview of either Conspiracy Theory or critical theory but haven’t skeptically investigated that worldview’s core or thought through its radical implications. Such people genuinely believe that these radical worldviews are simple, common-sense assertions: that we should distrust those in power and that we should treat people kindly.

The hardcore activists of each group tend to retreat to one of these moderate positions when someone fights back against the core premise of their radical worldview. Hardcore Conspiracy Theorists conflate something perfectly obvious – that bad people sometimes work together – with their theory as a whole: that some evil group rules the world and controls everything. Hardcore critical theorists conflate something perfectly obvious – that we should oppose racism and treat everyone with kindness – with their theory as a whole: that all our identities and knowledge are a function of our position relative to oppressive power structures.

Important debates often get derailed in arguments over terminology before they can ever make any progress toward the truth. Sometimes that’s by design. Some conspiracy theorists will insist “it’s not a conspiracy theory, it’s a conspiracy fact,” and claim that the term “conspiracy theory” was invented by the FBI or CIA to discredit those who had learned “the truth.”

There are lots of different names for critical theory/“wokeism,” all of which are “problematic” for some reason or another. It’s not critical race theory; that’s an “academic analytical tool.” It’s not Cultural Marxism because apparently that’s just an “anti-Semitic conspiracy theory.” What makes it an anti-Semitic conspiracy theory? The anti-Semitic conspiracy theorists sometimes use the phrase; therefore, it doesn’t exist. (For more about the context in which Cultural Marxism naturally exists, see my essay “Autonomy, Power, and the Possible: A Brief Intellectual History.”) “Identity politics” is a useful term, but it can refer to political demagoguery on the basis of any identity, while only certain identities are allowed to be elevated in critical theory. The critical theory woke have sometimes self-identified as Social Justice Warriors or SJWs, though now it is apparently “unpersoning” to call them that.

I also would prefer eliminating the word “theory” in “conspiracy theory” and in “critical theory,” as the word implies more intellectual rigor in these subjects than actually exists. If it were up to me, we’d call them “conspiracy guessing” and “critical racism.” But if we are ever to discuss a topic, we have to use words as commonly understood, endeavor to clarify when they are potentially ambiguous, and not change the meanings of words to sabotage the possibility of good-faith discussion.

The woke of both sides have adopted an age-old understanding of rhetoric that is sometimes held up as a principle of the critical theory approach to knowledge – that terminology can bypass logic by manipulating ethos and pathos. Antifa can’t be fascist, it has “anti-fascist” right in the name!

Since the days of the ancient Greek Sophists, and probably long before, humans have known the importance of controlling the terminology in controlling an argument. For thinkers who believed there is an underlying reality that humans can access, or that there are universal laws of math and logic, arguments had to be classified in order to separate logic from everything else. Aristotle therefore distinguished among the three modes of persuasion: ethos, pathos, and logos. Ethos is an appeal to the authority of the arguer and their sources, pathos is an appeal to the emotions of the listener, and logos is the appeal to logic and empirical data – or an attempt to fabricate or confuse it.

The most skilled rhetoricians have always known that ethos and pathos are the most effective ways to influence people, and are maximally effective when disguised as logos.

Ethos works in two ways: we can claim that something is good because Reverend King said it, or we can claim that something must be wrong because Hitler supported it – like neoclassical architecture, vegetarianism, Wagnerian opera, or motherhood. Some of the most common arguments we encounter on an everyday basis are in the form of a negative ethos, the thought process that says cultural Marxism doesn’t exist because the anti-Semitic conspiracy theorists say it does. Some of the “logical fallacies” you may remember if you’ve taken a writing class are ways to categorize illogical uses of ethos: appeal to authority, poisoning the well, genetic fallacy, ad hominem, etc.

The use of ethos has changed as our perception of what constitutes authority has been subverted by the common contemporary mindset. Among conspiracy theorists the sources we would traditionally regard as authorities – scientists, seasoned professionals, articulate thinkers – are regarded as less than worthless. Expertise is a marker of involvement in the conspiracy, and logical, well-reasoned, and evidence-based arguments are sometimes rejected by conspiracy theorists on the basis of the arguer’s expertise. As one flat-Earther told me, “It says the same thing on NASA’s website, so I know it’s fake.”

To reject all arguments from ethos in favor of investigating all claims logically and empirically is the ideal, though it is difficult to actually practice. Conspiracy theorists are notoriously uncritical about their sources if they come from someone they already agree with (for about a dozen concentrated examples, see my essay “Fact or Famine”).

Woke critical theory gives an academic gloss to that same age-old mental bias that underlies ethos. Expertise is similarly rejected because of its “problematic history” of “epistemic violence” against marginalized voices. The Enlightenment call to go “back to the sources” for evidence is replaced with the call to “elevate colonized/disabled/noncisconforming/fat/etc. voices.”

Like conspiracy theorists, they judge arguments not on their merits, but on the hidden agenda the arguer is assumed to be perpetuating. As Alison Bailey, Director of the Women’s and Gender Studies Program at Illinois State, says, “critical pedagogy regards the claims that students make in response to social-justice issues not as propositions to be assessed for their truth value, but as expressions of power that function to re-inscribe and perpetuate social inequalities.” This is called “Privilege-Preserving Epistemic Pushback,” and people of any race are guilty of it if they disagree with critical theory.
(Alison Bailey, “Tracking Privilege-Preserving Epistemic Pushback in Feminist and Critical Race Philosophy Classes,” Hypatia 32, no. 4 (2017): 882.)

Within the critical theory epistemological framework, assertions are no longer about facts or reasoning; they’re about identity. The most important phrase in postmodern rhetoric is “as a.” “As a person of color, as a parent of a disabled person, as a member of the LGBTQIADF community, I am uniquely and exclusively entitled to a point of view on this subject.”

Your identity, of course, gives you your own perspective, but not necessarily your own truth and certainly not your own facts. Even if it were the case that one’s perspective gave them their own truth, it would not follow that their truth is the truth for everyone else, and those outside their perspective but somehow inside their truth can only listen. In critical theory, perspective is a function of narrative, and perspective is the foundation of identity. Because each person’s identity is produced by their perspective, disagreeing – or even failing to actively agree – with their perspective is “denying their personhood.”

These conceptual similarities among the woke explain certain practical similarities that you may have observed in either critical theory or conspiracy theory. For example, when the woke do use evidence, anecdotes are always better than data. Even though black or African Americans are ten times more likely to be killed by someone who shares their skin color than by a white person, activists tell us that they should fear for their lives because of the few videos in which a black suspect is killed by a white cop. Even though repeated epidemiologic studies have found no association between the MMR vaccine and autism, we’ve all heard from someone whose cousin’s kid was diagnosed with autism right after receiving a vaccine.

The woke assert a claim to secret knowledge, to have taken the metaphorical “red pill” and to see the invisible power structures of the world and who really controls it. It’s like having a claim to magic powers. Yet they often treat those who disagree with them not as merely uninitiated, but as agents of evil. The mainstream cultural power belongs to the critical theory faction, and they are constantly asserting that power against those who somehow commit a thoughtcrime against their worldview. Though Conspiracy Theorists don’t have the same cultural power, they do have a certain influence over those in their audience. Writing this, I know I’ve probably already made a lot of Conspiracy Theorists very angry, and I’m risking accusations of being an agent of Illuminati disinformation. But I hope those who have stuck with me will appreciate my candor in talking about the issue directly rather than patronizingly playing along with ideas I disagree with just to avoid offending a potential audience.

The psychology of hard-core conspiracy theorists is complicated and the psychology of the hard-core critical theory woke is mostly unexplored. Exploring the psychology of the arguer, of course, doesn’t discredit their arguments, but it can be useful in understanding their worldview. Woke theories on both sides allow those who believe them to blame their problems or the complicated issues they see in the world on evil forces like cisheteronormativity, the patriarchy, the Rothschilds, Whiteness, the Illuminati, systemic racism, or the Jews.

Wokeism can act both as a quirk of individual psychology and as part of a larger community. Communities like this thrive on groupthink and mob psychology, keeping their adherents constantly fired up. Detecting systemic racism in unlikely spots is a badge of honor for critical theory adherents, just as detecting a conspiracy in ordinary events establishes credibility among conspiracy theorists.

The woke critical theorists also share a feature with conspiracy theorists in that they can seem harmless and goofy most of the time, but have the potential to be dangerous when given power. Power is always dangerous, but responsible people may be humbled by the complexity of the world and difficulty of their job and might exercise some restraint. The woke, however, think that they know how the world works and who they need to destroy to reach utopia. This mentality drove both Adolf Hitler and Pol Pot, as they strove to “free their people” from the people their theories deemed to be oppressors.

Poster from the 1941 “Anti-Masonic” Exhibition in German-occupied Serbia. Approximately 11,000 of the 12,500 Jews in Serbia were murdered during the occupation.

There are, of course, prominent differences between Conspiracy Theory and critical theory. For example, because it developed in plain sight on the internet rather than tucked away in the academy, Conspiracy Theory is spoken about in mostly plain English. This makes it easier to talk about, while the strange and nebulous language of critical theory makes it very difficult to identify its circular logic.

If you try to argue logically or with data against the woke, they will typically tell you to “educate yourself” by watching a really long conspiracy video or reading a dozen articles on Slate or Salon. The difference is that we rarely see celebrities issue groveling apologies to Conspiracy Theorists and assurances that they have now “educated” themselves about the harm their words have done. In terms of culture and popular acceptance, Conspiracy Theory and critical theory are far apart.

So can you be awake without being woke? You can distrust or oppose the government without believing every accusation levied against them just as you can oppose racism without believing every accusation of racism. But that means taking upon yourself the task of skeptically evaluating evidence for yourself. If that sounds exhausting, it is.

In a classical Persian poem, an unjust king asks a holy man, “what worship is greater than prayer?” The holy man says, “for you to remain asleep till the midday, that for this one interval you may not afflict mankind.” (Gulistan, Tale XII). If “wokeness” is to afflict mankind, then it might be better to go back to sleep.




‘Witnesses’ is a Rare Surprise

I’m a notorious Grinch on the subject of religious film.

I have a theory that the standards for religious films are so low because they only tell the audience what it wants to hear. The popular God’s Not Dead, for example, is a lousy movie on multiple levels, made to pander to our lowest intellectual tendencies: the part of us that wants a movie to spoonfeed us proof that atheists are all a bunch of idiots. Beyond this main thread, there are about a dozen other plots in God’s Not Dead, a hodgepodge of unrelated ideas and clumsily connected characters, including the Duck Dynasty guy, for some reason, in possibly the movie’s dumbest scene. An ambush reporter who doesn’t talk over her subject but instead allows him to speak while respectfully listening? Give me a break.

I’m mentioning my prejudice to indicate how surprised I am to say this: Witnesses is a great movie.

Here is a movie that understands that its role is not to be a sermon or a polemic, but an emotional journey. It doesn’t flatter the audience by telling them how right they are, it “discomforts the comfortable,” and takes the audience to doubt and back.

Witnesses is based on the mostly familiar story of Joseph Smith and the Three Witnesses of the Book of Mormon. It’s framed by an interview David Whitmer (Michael Zuccola; Paul Kandarian as the older Whitmer) gave to a reporter as an old man, as well as an incident back in 1833 when he was ordered at gunpoint by a mob to renounce his witness. Even though Whitmer frames the story, most of the heavy dramatic lifting in the movie is done by Martin Harris (Lincoln Hoppe), in the turmoil leading up to and perpetually following the loss of the 116 manuscript pages.

Hoppe gives a unique performance, taking us through Harris’s desperate struggle to reconcile his powerful doubts with his powerful faith. He has a face that constantly shows his thought process for us to see: big, formalistic expressions playing on top of each other.

Director Mark Goodman’s style here is moderately expressionistic – emphasizing the emotional perspectives of the characters – but it is also grounded in reality. There are no angelic choirs swelling in the background when the prophet speaks. The camera rarely engages in the pointless “artistic” shots endemic to independent film. The gold plates are not magical glowing relics; they are a solid presence throughout the film. Joseph Smith (Paul Wuthrich) even uses them as an improvised club in an early scene when being chased by thieves. This lack of distracting adornment makes the situation of the Smith family – living in close contact with the plates, even touching them, but never seeing them directly – even more emotionally surreal.

A lesser movie would have Joseph giving the audience a speech about why he does not show the plates to the world and why Harris should just have faith. Witnesses shows rather than lectures. Joseph says that he intends to keep his covenant about the plates. Martin Harris makes the same covenant about the 116 pages, and we see in visual and emotional terms how he lets his covenant slip away while Joseph stays firm to his.

We are taken through the difficult journey of the witnesses, which combines the despair of not being able to see the plates with the later despair of disillusionment in Kirtland. Having already seen the angel and the plates, they still experienced a crisis of faith, leaving Kirtland and the Church after condemnation from a demagoguing Sidney Rigdon (Joseph Carlson).

Two of the Three Witnesses, Martin Harris and Oliver Cowdery (Caleb J. Spivak, who looks uncannily like the real Cowdery), rejoined the Church years later. David Whitmer never did, though his witness survived both the guns of an angry mob and 50 years of bitterness toward the church.

But if you’re expecting a movie to give you proof of how right you are and how dumb those atheists and annoying evangelical billboards are, Witnesses might not be for you. Witnesses shows us the emotional problem of being lost in doubt and points toward the way out. Whitmer says in the end that “The Book of Mormon was not meant to be proven, it was meant to be read, and then asked of the creator of all.”

If you want the full stories and facts about the witnesses, try to hunt down a copy of Investigating the Book of Mormon Witnesses by Richard Lloyd Anderson, the late BYU historian and my inspiration and friend. Witnesses is dedicated to his memory and – to my surprise – it is a beautiful and fitting tribute.