An Elegy to the Weirdest Dude in American Politics

Earlier this month, Lyndon LaRouche died.

You’ve probably never heard of him. But you should have. He was the best presidential candidate in American history. Or rather, he was the best at being a presidential candidate in American history. He ran for the office eight times, a national record. In every election from 1976 to 2004, he was a candidate. His base was small in numbers but big in enthusiasm, and their support kept him relevant despite never winning a race for any public office.

Remember those posters of Obama with a Hitler mustache that went viral during the Obamacare debates in 2009? That was him. But don’t confuse him with right-wing lunatics like Steve Bannon or Steve King. Lyndon LaRouche defied any attempt to fit him into a political box.

His platform changed significantly over the course of his 50-year career in the spotlight, but generally speaking, he and his devoted followers supported investment in nuclear power, a return to a commodity-based monetary system (think the gold standard) and fixed interest rates, defending our way of life against an international conspiracy of the global Aristotelian elite, led by Her Majesty Queen Elizabeth II, to control the world through a combination of terrorism and the drug trade, and changing the pitch we use to tune our instruments to be slightly lower.

It is easy to read that and dismiss him as “what you get when you give a scientologist two pot gummies and lock them in a room with a flat-earther”, or “the most batshit insane conspiracy theorist I have ever seen” or “That Guy”. But it would be an injustice to lump him in with the likes of L. Ron Hubbard and Alex Jones. They don’t hold a candle to him.

Today, I will tell you the tale of Lyndon LaRouche. We will explore his life, his finest achievements and his greatest defeats. We will examine his worldview, and where his bizarre ideas and priorities came from. We will uncover the secrets of his enduring appeal. And we will examine the times where he actually got it right.

Which is more often than you might think.

1. “Sing in me, Muse, and through me tell the story of that man skilled in all ways of contending”

Early Life

Lyndon LaRouche was born in Rochester, New Hampshire, in the year 1922. His parents were Quakers, who forbade him from fighting other children, even in self-defense. This consigned him to a never-ending torment of bullying throughout his early years. He took to wandering alone through the woods and threw himself into the comforting grip of books, particularly philosophy. From his 1979 autobiography: “I survived socially by making chiefly Descartes, Leibniz, and Kant my principal peers, looking at myself, my thoughts, my commitments to practice in terms of a kind of collectivity of them constructed in my own mind.”

He was, in short, a geek.

After high school, he attended Northeastern University in Boston, but was “disgusted with the methodological incompetence of most of [his] courses”. He left in 1942.

From there, he wandered through the maze of far-left groups that existed among students at the time. He joined the Socialist Workers Party in 1949, but dismissed them as merely “a custodial staff keeping premises warmed and aired out for the arrival of actual revolutionary leaders”. His contemporaries described him as having an extraordinary breadth of knowledge and “a marvelous ability to place any world happening in a larger context, which seemed to give the event additional meaning”. But the same accounts said his analysis was only skin-deep. His ideas were often contradictory and lacking in detail.

He was also a clear egotist. While a Marxist and Trotskyist in name, LaRouche fixated on their discussion of the elite intellectuals who would join the working class’s revolution. He thought they were talking about him. He believed that he was the philosopher-king who could lead the masses to victory in the US as Lenin had in Russia.

These impulses magnified through the 1960s and early 70s. He took to savagely critiquing his fellow leftists for their past disagreements with him, asserting that history consistently proved him right (note: Lyn Marcus was his pen name at the time). He predicted economic depressions and imminent fascist overthrow and communist revolution. This made him a polarizing figure on the left, with a few ever-more-devoted adherents being counterbalanced by total abandonment from the rest of his audience.

The 1970s: A Leader Emerges

In 1973, he formed the first of his many organizations: the National Caucus of Labor Committees. It’s worth reading over its founding principles for yourself, but I’ve picked out the most important:

12. “Therefore, the political existence of the working class depends upon the intervention of an “outside agency,” whose function it is to bring the political (working) class for itself into being. This “outside agency” can only be a social formation which has already attained an advanced approximation of the working-class consciousness which the working class itself lacks. Only a handful of the capitalist intelligentsia is capable of fulfilling this decisive role, by combining an anti-capitalist political and social orientation with the mastery of history, sociology and economics from the standpoint of the dialectical method.”

17. “While the cadre organization must submit to the class interests of the potential political (working) class for itself, that means and demands insulating the vanguard organization from corrupting intrusions of reactionary (bourgeois) ideology dominant among working people generally, oppressed minorities, and radical students, etc., in a capitalist society. Realization of socialist conceptions means that alien political ideas have ipso facto no voting rights over the formulation of policy within the vanguard organization. It means that the less-developed consciousness of socialist principles must be subordinated to the most-advanced consciousness within the organization.”

In other words: LaRouche and his followers didn’t think an organic labor revolution was in the cards. They believed that the workers of the world needed to be united by an outside force of intellectuals. Only a few special minds would be up to the task. In practice, that meant Lyndon LaRouche and those who agreed with everything Lyndon LaRouche thought.

What’s more, the closing-off of dissenting voices was a foundational idea in the LaRouche movement. For the revolution to succeed, it had to be protected, even from the people it was ostensibly for.

The NCLC quickly took on the trappings of a cult. In 1974, the New York Times wrote of its practices:

“Total commitment is required for members. Jobs and families are expected to be secondary to party work; almost all members who are married also have spouses in the movement.”

It also describes a darker side. The early NCLC was obsessed with brainwashing, and LaRouche himself participated in a “deprogramming” incident with a member named Christopher White. He taped the entire affair, and in it one can hear sounds of weeping, vomiting, pleas for mercy, and LaRouche’s voice saying “raise the voltage”.

He probably selected White because White was British. Britain was a boogeyman to LaRouche, and remained so until his death. He believed that Imperial Britain never truly died, and that it continued to secretly fight to sustain capitalism. He believed that it and its allies controlled vast swathes of the world, including the US intelligence agencies. And since he believed that he was the only one who could lead the revolution, they were obviously preoccupied with assassinating him.

Note: interestingly, the FBI really WAS monitoring LaRouche, and some agents even proposed taking steps to help the US Communist Party eliminate him. The Bureau has long abused its authority to harass and disrupt both violent and non-violent leftist groups, and attacking LaRouche would not be out of character. I am, however, somewhat more skeptical that their tactics included mind-control.

By the mid-70s, LaRouche had abandoned any pretense of alliance with other leftist groups. Shortly after its formation, the NCLC would begin “Operation Mop-Up”: packs of LaRouchies would roam the streets of New York, beating to a pulp any members of rival leftist groups they found. One harrowing account was printed in the Village Voice.

He also began to reach out to a different kind of extremist group: far-right fascists. He abandoned Marx (note: if you value your non-aching head, do not try to read that), became a vicious anti-environmentalist, and made overtures to groups like the KKK. This led many outside the movement to say he had become a far-right fascist himself, though he continued to find allies at the fringes of both the left and right for the rest of his career.

Before we leave this era, I should mention that it saw his first presidential campaign, in 1976. His platform predicted the apocalypse in less than two decades if he did not win. The campaign also featured a paid half-hour address on prime-time TV, which would become a mainstay of his candidacies.

He received just over 40,000 votes nationwide.

The 1980s: Pride and a Fall

LaRouche reached the height of his power during the Reagan administration. His organization moved from New York to a mansion in the sleepy town of Leesburg, Virginia. They turned it into a fortified compound, guarded by camouflaged devotees armed with guns. They harassed the locals, accusing the local garden club of being a Russian psyop and forcing one lawyer to abandon the town.

In his greatest electoral achievement at the state and federal level, LaRouche-affiliated candidates managed to win the Democratic primaries for both Lieutenant Governor and Secretary of State in Illinois. The candidate for Governor declined to run alongside the LaRouchies and switched to a third-party bid. The Republican ultimately won the race.

At the same time, he began to spread the most pernicious lie of his career: that AIDS wasn’t sexually transmitted, but could be caught from just a cough or a touch, and that victims should be quarantined for the good of the public. He put a proposition on the ballot in California to ban AIDS patients from holding jobs and their children from attending schools. While it was defeated, it got over a million votes. It seemed only a matter of time before his movement would win one of these races.

This success would not last. The International Plot had finally found his weakness: the money.

Throughout the 1980s, the FBI had quietly been investigating LaRouche and his organization for a host of alleged financial crimes, including widespread credit-card fraud. As the decade wore on, the list of active investigators expanded to include both the IRS and the Federal Election Commission.

In 1986, the FBI raided LaRouche’s offices in both Virginia and Massachusetts. They found extensive evidence of a host of financial crimes, enough for a Boston grand jury to indict LaRouche and the leadership of his movement. The prosecution, led by a young US Attorney named Robert Mueller, alleged that LaRouche’s organization had committed over 2,000 cases of credit card fraud and made extensive efforts to obstruct the investigation and destroy evidence. The defense lawyers described (accurately) an extensive campaign of harassment against the organization by the FBI. The trial began in early 1987 and dragged on well into 1988. In the end, the judge was forced to declare a mistrial: the case had gone on for so long, and enough jurors had been excused for having to return to work, that the court could no longer maintain a jury.

The case then moved to Virginia, where it progressed more quickly. It also took on more charges: for some reason, the government felt it was strange that a man who lived in a fortified mansion hadn’t filed a tax return in a decade.

LaRouche was ultimately convicted on fourteen counts of varying forms of conspiracy and fraud, and sentenced to fifteen years in prison.

The Long Goodbye

Incarceration did not stop Lyndon LaRouche. He received daily intelligence reports from his organization while in prison, and even ran for president from a jail cell in 1992. He also shared a cell with the infamous televangelist Jim Bakker, who later remarked that “to say LaRouche was a little paranoid would be like saying that the Titanic had a little leak.”

LaRouche was released early in 1994. But it was never quite the same. His organization had been hollowed out by the investigation, and what was left had atrophied without his peculiar charisma.

At first, he threw his resources behind a campaign for his own exoneration, but it sadly failed. With that failure, he turned toward 9/11 conspiracy theories and other, almost run-of-the-mill fringe beliefs like global warming denial.

He also began to suffer a clear mental decline. His writings had never been the clearest, but they grew more and more bizarre, almost nonsensical. They lurched uneasily from discourse on sense-perception to asides on Truman’s presidency, with little rhyme or reason.

There are three remaining events of import in his story.

The first was his accurate prediction of the 2008 financial crash. It’s a real achievement on his part, albeit watered down by the fact that he had been predicting the crash was imminent for several decades. Better late than never.

Second, he was a pioneer of the most pernicious smears of the Obamacare debates. I mentioned the Hitler-mustache posters earlier, but he was also an early adopter of the death-panels myth, referring to the healthcare bill as “genocide” months before the Republicans caught on. The death-panels claim would later become PolitiFact’s “Lie of the Year”.

Finally, his last political act was to throw his support behind Donald Trump’s campaign. Because of course he did.

LaRouche died on February 12 this year. He was 96. He leaves behind a tight-knit group of several thousand devotees, many of whom have stuck with him since he first founded the NCLC.

For his entire career, LaRouche was an enigma to those of us on the outside. His bizarre mix of issues and theories seemed to have no rhyme or reason whatsoever. But there was a method to the madness, and a reason why so many people stayed so loyal to him for so long.

2. “The Good therefore may be said to be the source not only of the intelligibility of the objects of knowledge, but also of their existence and reality”

In order to grok Lyndon LaRouche, you first have to grok his worldview. And for that, we have to go back all the way to ancient Greece, and the discourse of Plato and Aristotle. I am not going to explain every aspect of the two philosophers’ beliefs, but the most pertinent debate covered the obscure and minor subject known as “the nature of reality”.

Plato believed in an idea of the “forms”: roughly, our reality, ugly, imperfect, and chaotic, is a reflection of a set of greater Ideas. The chair I’m sitting in is just an imperfect realization of the Idea of “chair”. He described our struggle to understand and contemplate those forms in the famous “Allegory of the Cave” that you probably read in high school.

Aristotle took a more naturalistic approach. He did not believe that there was some perfect representation of “chair” out there for us to understand. He saw these as unproductive thought experiments. He preferred to ground himself in observations of the natural world around him.

Lyndon LaRouche turned that dispute into the driving force of all human history since. To him, every intellectual in every discipline has, without realizing it, been a follower of Plato or Aristotle. The pursuers of the Ideal, and the Church of Sense Perception. Beethoven, Shakespeare, and Kepler are Platonists. Kant, Locke, and Hobbes are Aristotelians.

He also takes a side. The Platonists are right. The Aristotelian tradition of pure naturalism has led us astray. He blames the failed leadership of the Aristotelians (who are usually oligarchs) for most, if not all, of the world’s ills. To him, the intellectual descendants of this Greek philosopher are powerful and power-hungry, and we must stop them at any cost.

And Lyndon LaRouche is the only man who can do it.

Many have tried to label LaRouche as either a “Marxist” or a “Fascist”. But they never fully fit. He is at times one, and then the other. His political ideology does not match any other movement in our history, because philosophically, he acts on a different axis. They, like the poor souls in Plato’s cave, see only the shadows his worldview casts on the wall. The light flickers, and the shadow changes from left to right and back again, but the object stays the same.

Likewise, the many accusations of anti-semitism, racism, and homophobia fail to land. While he may have been all three, and he certainly used racist, homophobic, and anti-semitic language and pursued hateful policies against each group, such concerns were tangential to him. It’s the reason why he managed to lead a movement with so many Jews that it alienated the KKK while authoring pieces titled “My View on the Jewish Question”. He didn’t use pseudo-philosophical conspiracy theories as coded language for antisemitic beliefs. He used antisemitic rhetoric as coded language for his pseudo-philosophical conspiracy theories.

Once you accept the core premise of Plato vs. Aristotle, you begin a descent down a truly bottomless rabbit-hole of paranoia and pseudo-history, pseudo-philosophy, pseudoscience, really all the pseudos you can muster. Kepler’s theories of planetary orbits become a real-life manifestation of the platonic forms. They, in turn, are a metaphor for the discrete stages of human progress (an idea originally taken from Marx). The deeper you go, the more his bizarre panoply of policies makes sense. Take his opposition to environmental regulation and support of nuclear power. If you see the purpose of humanity as unlocking these higher spheres of progress, like expanding out through Kepler orbits, then anything that obstructs our progress is an evil thing. We can’t afford to waste time caring about our impact on the world around us if it would arrest our forward motion intellectually. And we’ll need a lot of energy to fuel our philosophical explorations as more people devote their time to the arts…

If it doesn’t make perfect sense to you, don’t worry. I’d be concerned if it did.

 

3. “All Men by Nature Desire to Know”

It is easy for those of us on the outside to dismiss LaRouche’s ideas as the worst sort of crypto-fascist nonsense and his followers as misguided morons. We all want to think that there is no way we’d ever fall for such blatant malarkey. But the science of cults tells us that that belief is as much a fantasy as anything that came out of Lyndon LaRouche’s mouth. If you don’t believe me, I suggest you check out this TED Talk from a former member of a different cult on how they lured her in, though I should warn you, it has some awfully disturbing imagery.

At its core, the movement’s appeal comes from our own search for meaning. It begins by targeting people who are already struggling with that question: the young and idealistic, the lonely and isolated, and the grieving. Essentially, people who feel in some way unmoored in our large and uncertain world.

LaRouche and his followers take these vulnerable people and give them an anchor. Where the rest of us offer chaos and confusion, LaRouche offers clarity and confidence. He tells you that the world isn’t all that complicated after all. He tells you that the truth is out there, and he can help you find it. Most importantly, he tells you that you are special. That you have a purpose. You were meant to fight in this titanic struggle between good and evil. Only you can learn how the world really works. And your job is to help spread that knowledge to others. All you have to do is listen to the man on the brochures.

The group’s pitch is both personal and interpersonal. Movements like LaRouche’s offer a strong sense of social comradeship. The other members are your tribe now, and they look out for their own. New recruits are encouraged to abandon their old ties to friends and family, replacing them with bonds to fellow LaRouchies. When they do so, they provide the same social validation to the next wave of recruits.

Then there is the initiation process. The other members purge your mind of the preconceptions that might interfere with the movement’s goals, in order to ensure your loyalty. Often, this gets physical (“Raise the voltage”), but not always. It can be as tame as attending a rally under the right conditions. Outsiders call this process “brainwashing” and “programming”. LaRouche described it as bringing out the recognition that “one’s self as presented to the world is not ‘the real me,’ not the ‘soul.'” I use a different term: hazing. We exoticize the process by calling it “brainwashing” to convince ourselves that we wouldn’t fall for it, but the truth is that it is not so different from your average fraternity initiation or elaborate team-building exercise. They operate on the same basic principle: validating your sense of belonging to the group through shared suffering or sacrifice.

Whatever our individual differences, all humans are united by a few fundamental desires. We all want to feel like we belong, like we have some identity to be proud of. We all want a sense of purpose and fulfillment in our daily lives. We all want to feel right, like we are a good person, and like we know what to do next. Ideologies like LaRouche’s are a one-stop shop for all of these basic needs. They give you that identity. They give you that purpose. They give you the certainty that you are right, because LaRouche is right, and you’re with him.

Usually, when non-members talk to true believers, we fixate on the contradictory evidence the true believers rationalize away or ignore. We look on in disbelief as they dismiss overwhelming scientific consensus, the apocalyptic predictions that didn’t happen, even proven criminal activity as “fake news” and “propaganda”. But that behavior is only an exaggerated version of impulses we all share. When we are presented with information that contradicts our worldview or identity, we find a way to disregard it. LaRouche and his adherents are just more audacious in their confirmation bias than we are.

This appeal can work on anyone. Intelligence and education are no shield: the first members of LaRouche’s movement were Columbia students. Rich and poor, black and white, men and women, we all share the same cognitive biases and we all are vulnerable to the undeniable pull of a movement that has all the answers.

 

4. “I much prefer the sharpest criticism of a single intelligent man to the thoughtless approval of the masses”

A funny thing happened while I was researching this post: frequently, far more frequently than I ever expected, I found myself agreeing with Lyndon LaRouche.

Sure, he said some crazy stuff. I have barely scraped the surface of the lunacy he espoused over the course of his career. You could fill a book with all the bizarre, absurd, and incomprehensible theories the man came up with. Many people have, including LaRouche himself.

But the great tragedy of Lyndon LaRouche, what sets him apart from Alex Jones and Louis Farrakhan and the other also-rans of our political fringe, is just how often he actually got it right.

Take a line like this: “automation not only wipes out jobs, it wipes out the need for old-style, repetitive factory labor. In place of production workers, we will need an equal or greater number of engineers and scientists. Our whole educational system will be hopelessly outdated by these changes in the means of production. Educational changes must be made so that we may have the skills we need.”

Today, that sounds like an almost trite observation of our post-industrial economy. Automation and the collapse of manufacturing jobs across this country helped elect Donald Trump. It is a fundamental force shaping our entire culture. And education and re-education are contributing factors: we have far too many unemployed steel-workers, far too few software engineers, and no good way to convert one into the other. Right now, that passage is uncontroversial.

Lyndon LaRouche wrote that in 1954.

Even his core delusion, the Aristotelian conspiracy, is rooted in a real philosophical dispute. Aristotle and Plato really did disagree. And while I’m unconvinced that their dispute is the driving force of history, when LaRouche starts talking about objectivity and its limitations, he makes an uncomfortable amount of sense:

“In reality, what we call “modern science” is a highly subjective business. People who run around talking about “objective science” really show that they don’t know much about the history of science.”

That statement is true. History has shown us that, while the scientific method may be immune to bias, its practitioners are not. If science and its practitioners really were objective, it wouldn’t advance one funeral at a time.

Note: in the section after that quote, he says a cabal of gay British Aristotelians are concealing the evidence that cold fusion is real and also that rock music is a Satanic plot. I didn’t say the man was perfect.

Every account by outsiders who knew LaRouche describes astonishment at the breadth of his intelligence. He was well-read, sharp-witted, and at times downright insightful. I don’t know whether his paranoia was built into his genes or if some therapy and the right environment of fellow thinkers might have tamed his worst impulses. But I do know that buried within the lunatic was a true visionary.

5. “There is a strange charm in the thoughts of a good legacy”

History will not be kind to Lyndon LaRouche. His lies have done too much damage for him to be more than a figure of scorn, if he is remembered at all. His legacy will be the violence of his followers, his misinformation campaigns about AIDS and Obamacare, and his climate change denialism. That is a good thing. The pain they have caused has already outlived him, and will continue to haunt us for decades to come.

And yet, it feels wrong to celebrate his departure, to reduce him to his greatest crimes. He was more than just another cultist or con-artist hovering at the fringes of our politics. He was a living contradiction, a proof that our understanding of our own culture isn’t as concrete as we’d like to believe. He made allies of black nationalists and the Ku Klux Klan. He was completely insane, but his brainwashed followers are probably better-versed in the classics than you are. He was a politician, a philosopher, a cult leader, a communist, a fascist, a felon, even (horrifyingly) a poet. He was, by a significant margin, the weirdest person in American politics.

Lyndon LaRouche died at the age of 96. For the first time in a century, we are living in a world without him. It’s a different world from where we lived before. It may be better off for his absence. It probably is. But whatever else, it is certainly less interesting.

The Wild and Wacky World of Christian Media

When I was in high school, I was browsing through the sci-fi section of a local book fair when I came across two books that sounded right up my alley. I still remember the first sentence of the blurb: “In one cataclysmic moment, millions around the globe disappear.”

The series was called “Left Behind”.

I didn’t make it past the first hundred pages. The authors had this weird need to add every character’s relation to religion into their introduction, and it bothered me. That was fine. My bookshelves are packed with mediocre speculative fiction that I’ve opened only to close half an hour later.

It was only when I mentioned them to my parents that I learned that these weren’t just badly written sci-fi. The Left Behind series was the evangelical Christian equivalent of Harry Potter, a massively bestselling book series telling a story that was as much prediction as fiction. I gave them a quick Google, went “huh, interesting”, and moved on with my life.

I didn’t think about the Left Behind series again until 2014, when YouTube recommended that I watch the trailer for… Left Behind, the movie? Starring NICOLAS CAGE???

What followed was a years-long dive down the rabbit hole that is the Christian Entertainment Industry, culminating in the post you’re reading today.

A few minutes after I watched that trailer, I learned that not only was this movie not a fever dream spawned by an unholy combination of final exams, too much Fox News, and an ill-advised drunken viewing of The Wicker Man, it wasn’t even the first Left Behind adaptation on film. There was a trilogy from the early 2000s, covering the first three books of the sixteen-part series. They had been created by a company called Cloud Ten Pictures, which, according to Wikipedia, specialized in Christian end-of-the-world films.

But for a company to specialize in something implies that there are other companies that don’t. This, in turn, led me to the many, many corporations over the years dedicated to creating literature, music, and of course, film, exclusively by and for the Evangelical Christian movement.

I want to begin by clarifying that I have not seen most of the movies these studios put out. There are hundreds, perhaps thousands of them, and especially with the oldest ones they can be pretty hard to find. But I have seen most of the ones which are on YouTube, which it turns out is a lot of them. And I see three eras, as distinct in tone and style as silent pictures were from the talkies, and the talkies from the technicolor blockbuster.

1951-1996: Prehistory

Like most movements, you can pick almost any date you like for when the Christian Media Boom began, but I place its foundations on October 2nd, 1951. That’s when evangelical minister Billy Graham’s brand-new production company, World Wide Pictures, released Mr. Texas, which Wikipedia tells me was the first Christian Western film in history. I wasn’t able to find a copy of Mr. Texas, but I did find a copy of the second Christian Western ever made, a flick called Oiltown, USA. It’s a classic romp about a greedy atheist oil baron who is guided to The Light by a good-hearted family man named Jim and a cameo by Billy Graham himself. It was made in 1953, and it shows, right down to the black housekeeper whose dialect is two steps removed from “massa Manning”. It takes a ten-minute break near the end to show us a Graham sermon, unabridged and in its entirety. And I have to tell you, that on its own makes the whole 71 minutes worth watching. He delivers his soliloquy with what I can only describe as a charismatic monotone. The same furious, indignant schoolmaster who chastises me with “When Jesus Christ was hanging on the cross, I want you to see him, I want to see the nails in his hands, I want you to see the spike through his feet, I want you to see the crown of thorns upon his brow!” also angrily comforts me that “Christ forgives and forgets our past!” It’s the funniest thing I’ve seen in ages, and yet it’s so earnest, you almost can’t help but be moved.

There’s actually a lot of room for thematic discourse on Oiltown, despite it having fallen so deep into obscurity that it’s got a total of one review on IMDb. Halfway through, there is a scene where Les, the oil baron, pulls a gun on one of his employees in a scuffle, before being disarmed by Jim, the Right Proper Christian of the show. The gun, which since “The Maltese Falcon” has been used as a stand-in for the male phallus as a symbol of masculine power that won’t get an NC-17 rating, is first brandished by Les. He is disarmed, or metaphorically castrated, by Jim. Jim then engages in Christian charity and returns Les’s manhood to him by placing it on the desk. Our Christ stand-in both gives and takes away power, in equal measure. “Christ forgives and forgets our past”, after all.

There’s also a weirdly anti-capitalist message buried in here, though it’s mostly subtextual. While it’s implied that Les is an adulterer and he clearly is capable of murder, the main sin we see in him is greed, manifested by a corporate desire to make money at the expense of a soul. “Here’s my god, two strong hands, a mind to direct them, and a few strong backs to do the dirty work!” he proclaims to Jim when they first meet. It’s unlikely a less Christian movie could have gotten away with such progressive messaging in 1953, the same year Joe McCarthy got the chairmanship of the Senate Permanent Subcommittee on Investigations. But I digress.

More importantly, it sets up a few themes we will continue to see in Christian film later on. The first, and most glaringly obvious, is the anvilicious message of “Christians good, non-Christians bad”. The only explicitly non-Christian character in the movie, Les, is an adulterous cash-obsessed robber baron who gambles and blackmails and nearly commits murder. Contrast that with Jim and Katherine, the golden couple who can do no wrong, and you see a deeply Manichean worldview: for all Billy Graham’s talk that “there is not enough goodness you can do to save yourself”, in his world, good works are the unique domain of the Christian people.

Second, you should note that even back then, it’s not enough for Les to simply be an evil man. He is also actively hostile to the church. He sees it as nonsense that he must protect his daughter from at any cost, and says as much to Jim. In the context of the story, that makes sense. This is a Beauty and the Beast tale, with Les as the Beast and Billy Graham as a strangely-cast Belle. For his arc to be how he becomes a good person through Christ, he has to begin at a point of maximal disbelief. But it leaves no room for the quiet disbelief or genuine faith in other religions embodied by most non-Christians and even Christian non-evangelicals. Sadly, that’s the beginning of a long trend.

World Wide Pictures continued to pump out movies for decades after those first hits. Most of them retread the same ground broken by Oiltown, USA: a man goes through some rough ordeals and is ultimately saved by God and Graham.

 

Billy Graham’s outlet stayed the same, but the world around him changed. His brand of evangelicalism was strictly apolitical, and he had cozy relationships with Democratic and Republican presidents alike. He happily fought for anti-sodomy laws and a swift response to the AIDS crisis in equal measure. His brand was driven by the gospel, not Republican politics.

That strain of evangelical Christianity died with Roe v. Wade and the cultural upheavals of the 1960s and 70s. The dominance of traditional Christian values had been under threat in America for years, but now we had legalized what, in the eyes of the church, could only ever be baby murder. The response was a more political religious tradition, and the birth of modern Christian fundamentalism.

This tradition, while still hailing Graham as a forefather, owes just as much to Barry Goldwater and Ronald Reagan for its existence. It believed, and believes to this day, in young- or old-Earth creationism as a valid scientific theory that should be taught alongside evolution in public schools. It rejects the notion of the separation of church and state, arguing that the universally Christian founding fathers formed this country as a Christian nation, discounting the words of Thomas Jefferson and the First Amendment alike. It rejects the liberal attitudes toward sex that became popular in the 60s, and perceives promiscuity, homosexuality, and “sexual deviancy” writ large as threats to the very fabric of society. And that’s just the tip of the iceberg.

As of 2017, 80% of evangelical Christians subscribe to a religious theory called dispensationalism. Among its tenets is the belief that the second coming of Jesus Christ is tied to the establishment of the state of Israel, and that Jews, all Jews, must live there before Christ can return to end the world and sort all remaining souls into heaven or hell. What’s more, they believe that it is their duty to bring about that outcome as quickly as possible: they wish to be vindicated in their beliefs. This manifests in rabid support for Israel, and particularly for the Israeli right-wing and its West Bank settlers. Don’t mistake that for support for the Jewish people, however. More on that later.

It also left many of its members open to the worst impulses of the American right. There has always been a paranoid streak in our society, going all the way back to the Alien and Sedition Acts. Even Billy Graham himself railed against the creeping influence of communism. Under the leadership of his son, Franklin, and the former segregationist and Moral Majority founder Jerry Falwell, those impulses were magnified into widespread belief in nationalist, even white nationalist, conspiracy theories. Many evangelicals genuinely fear the dawn of a New World Order, seeing the seeds of it in the globalist economy and the formation of the UN. They see the influence of “international bankers”, the creation of shared currencies like the euro, even the Catholic church as vectors for the Antichrist’s eventual rise to power.

 

The prehistoric era of the Christian film industry ended in 1994, through the actions of a man named Patrick Matrisciana. Sixteen years before, he had created a Christian production company called “Jeremiah Films”, which is still around today. As I’m writing this, its home page prominently displays a trailer for a “documentary” claiming that the Antichrist may be either an alien or a sentient artificial intelligence.

I will also note that its homepage redirects to its store.

Whatever diminished form it may take today, in 1994 the company, or at least its founder, had a far wider influence. With distribution help from Jerry Falwell, Matrisciana produced the documentary The Clinton Chronicles, ostensibly through a group called “Citizens for Honest Government”.

The film is available in its entirety on YouTube, complete with an opening informing us that “unauthorized duplication is a violation of federal law”. Beyond the standard and largely credible accusation of philandering on the part of Bill Clinton, it also claims that he and Hillary engaged in a widespread conspiracy to traffic cocaine through Mena Airport in Arkansas, partook in it themselves, sexually abused young girls, and murdered anyone who threatened to expose them. It relied heavily on the testimony of a man named Larry Nichols, a disgruntled former Clinton employee who had been fired for malfeasance in 1987, when Bill was still governor of Arkansas. The producers also paid over $200,000 to various individuals to support his claims. While Matrisciana denied that Falwell had anything to do with the payments, that only leads me to wonder where the money came from instead.

According to the New York Times, about 300,000 copies of The Clinton Chronicles made it into circulation. The general audience for the conspiracies it introduced to the world numbered in the tens of millions.

At this point, the Christian film industry had crossed the event horizon. The Clinton Chronicles was an undeniably secular, completely political film, paid for by church dollars with the intent to take down an opponent of the Moral Majority. And it had gone viral in ways that no prior production from the industry had, all the way back to the days of Oiltown and Mr. Texas. Not only had the industry reached a far wider audience by humoring its most paranoid impulses, it had uncovered a way to make large sums of money doing so. And so, at long last, we reach the films that introduced me to this world of mysteries.

1996-2007: The Left Behind Era

The last years of the 20th century were a perfect breeding ground for paranoid conspiracies. The world was undergoing radical social change at a previously unimaginable pace, but more importantly it was uniting. This was the age of Fukuyama’s “The End of History”, when liberal democracy seemed the future across the globe. We saw the formal creation of the EU in 1992 and the Euro in 1999, saw phrases like “made in China” go from obscurity to ubiquity, saw the dawn of the internet in the public consciousness. And with the collapse of the Soviet Union, there was no longer an easy villain on which to foist our cultural ills.

With such societal upheavals comes a natural fear of the changes they bring. The same impulses which today fuel the nationalism of Donald Trump and populists across the globe stoked the fears of the Christian right in 1996. The idea of consolidating the whole of humanity under one financial and political roof meant no more America, the Christian nation. While I doubt many in the Christian right thought in those terms, there was very much an undercurrent of worry that one day you might wake up and not recognize your own country anymore.

Combine this with a religious fascination with the end of the world, and the already-established belief that this end, with its all-controlling Antichrist, was nigh, and you have a recipe for not just Clinton bashing but full-fledged devotion to conspiracy.

These conspiracies defined the second great epoch of the Christian Media Industry. Like the campy sermons of Billy Graham’s heyday, it centered on a single touchstone: the Left Behind series.

 

Left Behind started in 1995 as a book series by the minister/writer duo of the late Tim LaHaye and Jerry B. Jenkins. There are twelve books in the main series, plus three prequels, a sequel depicting life after the Second Coming, a spinoff series of children’s books, a real-time strategy game à la StarCraft which gives you the option to play as the Antichrist but not to train women in roles other than nurse and singer, and of course, four movies.

Today, it has sold 65 million copies, more than Little House on the Prairie.

The plot is an exercise in eschatology, the study of biblical accounts of the end of the world. The rapture happens in the first book, taking all true Christians and all children from the earth. The remaining word count is devoted to the titular folks who were left behind as they struggle through the seven years of tribulation before the Second Coming of Christ. The Antichrist is Nicolae Carpathia, originally the Romanian President, who, with the initial backing of two “international bankers”, soon becomes the UN Secretary-General and later the leader of a group called the Global Community. The Global Community is a totalitarian regime only one or two steps removed from the most fanatical ideas of the New World Order. It has a monopoly on the media through its news network, GNN; it kills anyone that gets in its way; and most importantly, it establishes a new global religion to supersede all others, including Christianity. At first, this religion is called “Enigma Babylon One World Faith”. No, I am not making that name up. Later on, when Carpathia is assassinated only to be resurrected and possessed by Satan himself, it is replaced by “Carpathianism”.

Earlier, I compared these novels to the Harry Potter series for the religious right, and while that comparison is apt in terms of magnitude, it is not accurate in terms of what it means to its audience. Harry Potter was nothing more or less than a work of fiction. It was well-written, easily accessible escapism.

To the religious right, Left Behind is more like The Hunger Games: a stylized representation of our own anxieties about the future. I doubt that any evangelicals believe that the Antichrist will be named Nicolae Carpathia, or that a pilot named Rayford Steele and a journalist named Buck Williams will be the keys to our salvation. But as of 2013, 77% of evangelicals in the US believed we are living in the end times right now. For much of Left Behind‘s multimillion-strong audience, this is not theoretical.

I have not read much of the Left Behind series. My knowledge of its text and subtext is largely limited to the extensive and lovingly-written wikipedia summaries of each book. This is enough to tell me such critical details as the type of car Nicolae and the false prophet, Leon Fortunato, use to flee Christ when he finally returns in Glorious Appearing, or the fate of satanist lackey Viv Ivins (Get it? VIV IVIns, VI VI VI, 666?), but not enough for me to feel comfortable discussing their thematic undertones.

I have, however, seen the films. Created in the early 2000s by Cloud Ten Pictures, they weren’t even the studio’s first foray into Christian armageddon. They were preceded by the four-part Apocalypse series, which covered the same ground but without the Left Behind label. Note that the first three Left Behind novels had already been published before Apocalypse: Caught in the Eye of the Storm was released, so these movies likely owe their existence as much to a rights dispute as to the artistic vision of their creators. As for the movies themselves, I cannot do the experience of watching them justice in words. You have to see this shit for yourself.

By 2000, however, Cloud Ten had worked out the kinks and were ready to move forward with their film. It, and both of its sequels, are currently available on YouTube. If you have the time, I encourage you to watch them. I don’t know how it feels to be a born-again christian watching them, but for someone outside the tribe, they are the epitome of “so bad, it’s good”.

That is, until you remember that millions of people out there are dead serious in their appreciation for this story. It departs the realm of the realistic only a few minutes in, when it portrays an immense squadron of what appear to be F-15 Eagles invading Israeli airspace and attempting to carpet-bomb Jerusalem.

Whatever your opinions may be on Israel, the threats to its existence don’t take the form of American-made fighter jets flattening cities. Yet the reverent tone in Chaim Rosenzweig’s voice when he says “no one has more enemies than Israel” tells us that this sequence is not an over-the-top farce or spoof. This movie genuinely wants us to believe that this is what life is like in Israel.

In three movies and 270 minutes of screen time, there is not one mention of Palestine or the Palestinian people. There are only passing references to what “The Arabs” might allow with regard to the Temple Mount.

Earlier, I cautioned you not to confuse the evangelical movement’s support for Israel with support for Jews. By the second movie, you begin to see why. The climax of the film centers around an announcement by Rabbi Tsion Ben-Judah in Jerusalem, “the world’s most knowledgeable and respected religious scholar”. This announcement is ultimately revealed to be that he now recognizes Jesus Christ as the messiah, in light of overwhelming biblical evidence. This image, of a renowned Jewish scholar renouncing his faith to support the ideals of born-again Christianity, is how LaHaye and Jenkins see the Jewish people: Christians in all but name, only a few short steps from embracing the True Light.

To be fair, in that regard the Left Behind series is an equal-opportunity offender. The dichotomy of Christians good, non-Christians bad first seen in Oiltown has reached its logical extreme here. Every person who is remotely reasonable or kind is either a born-again Christian or becomes one by the end of the series. The general dickishness of Les Manning has now escalated into service to the literal Antichrist.

Perhaps the best evidence for this worldview is the depiction of the events immediately after the Rapture: as though freed by the absence of evangelicals to release our inner beasts, those left behind immediately turn to violence and looting. A main character’s car gets stolen off the freeway (what happened to this looter’s car? did he just leave it on the interstate?); martial law is declared within 12 hours; the world falls into chaos to an almost comical degree. Some of this is real, in that if hundreds of millions of people disappeared in an instant there would be vast upheavals, but humanity’s experience with natural disasters has taught us that there are usually as many good samaritans as there are opportunists.

Not so in Left Behind. Recall what Billy Graham said 47 years before: “there is not enough goodness you can do to save yourself”. The good samaritans were all raptured, because all the good samaritans were Christian.

These films, like The Clinton Chronicles, openly embrace the paranoid impulses of their audience. The Antichrist rises to power through the influence of the global financial elite, manipulating and contorting the UN into a vehicle for his ambitions. This is explicitly tied to the drive to unify the world under one government, one language, one currency, and one culture. When Carpathia reveals his plan for world domination in the final moments of the film, the world map he uses to display it is not the Mercator projection that had been used in public schools for a century. Nor is it the newer Robinson projection that the National Geographic Society would happily have given them for free.

Instead, viewers are greeted with the obscure and rarely-used azimuthal equidistant projection, which places the North Pole at the center and distorts the southern hemisphere to the point that Antarctica looks like a JPEG artifact. It has neither fidelity of shape nor of area, and to the best of my knowledge it has only one common use:

It’s the flag of the United Nations.

[Image: the UN flag]

With all that said, it’s also undeniable that the Left Behind movies did not have the same pull as the books. The first film barely made back its budget, and the sequels didn’t even get a theatrical release. In fact, the production quality was so bad that Tim LaHaye sued Cloud Ten Pictures claiming breach of contract. Left Behind 3: World at War came out in 2005, the last Left Behind novel came out in 2007, and after that the Christian right largely moved on. The Google Trends data for the series shows a slow, petering death.

 

The Dark Ages: 2007-2014

After Left Behind, the Christian Media Industry underwent its own seven-year tribulation. The circumstances of the world around it remained ideal for this conspiratorial mindset. If anything, they were improving. The 2008 recession had taken railing against international banks from the realm of antisemitic claptrap into the mainstream, as Americans watched their tax dollars funneled into Wall Street bailouts. And the Christian right now had a black, Democratic president who had spent time in a Muslim country in his youth. And indeed, the birther hoax, along with the less common but still widespread belief that Obama himself was the Antichrist, emerged at this time.

Even these environmental shifts, however, could not overwhelm the pressure of demographics. My generation is perhaps the least religious in American history, and as the first millennials entered adulthood, evangelical membership began to plummet.

It wasn’t just a matter of religion, though: the Christian right had built its rhetoric around certain fundamental issues, most notably opposition to gay marriage. That made sense in 2001, when that view was shared by 57% of the electorate, but that majority was dissolving rapidly and with no end in sight. Their issue profile didn’t just fail to appeal to the future of America, it actively drove it away.

In 2011, 87% of evangelical ministers felt their movement was losing ground. That was the highest share reported by church leaders in any country.

The Christian right needed to rebrand, and fast. It needed to completely overhaul its message in a way that spoke the language of millennials, with their emphasis on equal treatment of minority groups, without alienating its traditional congregation, which still cared deeply about traditional-values issues like abortion and gay rights.

From that effort came the greatest triumph of the entire industry, a film that catapulted it out of irrelevance and into the national spotlight. It has been the elephant in the room since I began this piece. It’s time to discuss God’s Not Dead.

 

The Second Coming: 2014-Present

God’s Not Dead exploded onto the national scene. While relatively small by the standards of mainstream Hollywood, its nearly $65 million box office take makes it the most successful film out of the Christian film industry by a factor of two. It earned more in its opening weekend than the gross of all Left Behind movies made up to that point combined. It even came out seven months before a Left Behind remake, the Nicolas Cage vehicle that was meant to bring more professionalism to the series. The remake’s box office take ended up at less than half that of God’s Not Dead, despite opening in more than twice as many theaters. This was the new face of Christian Media.

Especially in comparison to the apocalyptic bombast of its predecessors, the God’s Not Dead series is deeply personal. Each movie is, at its heart, a courtroom drama, depicting a case in which God is, metaphorically, on trial. In the first, it’s a university classroom where a professor is forcing a student to sign a pledge that God is dead. In the second, it’s a literal court case, concerning alleged misconduct by a teacher bringing Jesus into the classroom. In the third… We’ll get to that later.

This innovation makes them vastly superior vehicles to their predecessors. I have never been forced to sign a pledge that God is Dead to pass a college course, nor has any student in American history, but we have all encountered a pretentious professor with a political axe to grind. They appeal to certain universal experiences which are inherently more relatable than The Antichrist.

That is not to say that these movies are good. The first one, for instance, tries to go for a Love, Actually-style story featuring several disconnected narratives that are ultimately united by a common theme, but its choice of narratives dooms it from the start. Love, Actually succeeded because each story was roughly balanced in terms of the stakes involved, and there really was no “main plot”. Neither is true in these movies, and the result is that you spend a lot of your viewing time wondering why you’re watching the travails of these two preachers trying to rent a car to go to Disney World and when you’ll get back to the interesting part in the classroom.

Thematically, however, they are fascinating in ways that none of their predecessors are, because these don’t just show a heroic struggle of good vs. evil or serve as vehicles for a sermon: they show atheists making their case, as the Evangelical Christian movement hears it. That deserves a more detailed analysis, so we’re going to discuss them individually. Well, the first two at least. The third was a box office flop and isn’t even available on Pure Flix’s website anymore.

God’s Not Dead (2014):

 

Just to make sure we’re all on the same page, here’s a brief summary: at an unnamed university, freshman and devout Christian Josh Wheaton signs up for a philosophy class led by the militant atheist Professor Radisson. Radisson demands all his students begin the class by signing a pledge that God is dead. Josh refuses, Radisson tells him that if he doesn’t sign he will have to prove God exists to the class, and Josh agrees. There are several other subplots, most of which are there to show how horrible non-Christians are, including:

  • Amy, a liberal gotcha-journalist with an “I <3 Evolution” bumper sticker on her car, who gets cancer and is dumped by her boyfriend when he finds out (“I think I have cancer”, “this couldn’t wait until tomorrow?”), then converts when she realizes she is alone in the world.
  • A Muslim student named Ayisha who has secretly converted to Christianity thanks to sermons by Franklin Graham (son of Christian film’s progenitor, Billy) on her iPod, and who is then disowned by her father when he finds out.
  • Professor Radisson’s abused Christian girlfriend, Mina, and her struggles with their relationship and the care of her senile mother.
  • Pastor Dave, played depressingly straight by one of Pure Flix’s founders, trying and failing to get to Disney World.

After a long debate, Josh persuades the class that God’s not dead, and the characters from all these plots wind up together at a concert by the Christian pop-rock group Newsboys. Radisson gets hit by a car outside, but as he is dying, Pastor Dave persuades him to convert to Christianity, saving his soul. Before the credits roll, the film scrolls through a long list of real-world court cases where Christians have allegedly been persecuted in the US, on which this film is based.

There are many cringeworthy moments, from the Good Christian Girl approaching Ayisha as she’s putting her headscarf back on and telling her “you’re beautiful… I wish you didn’t have to do that”, to Pastor Dave’s missionary friend looking to the sky and saying “what happened here tonight is a cause for celebration” while standing over Professor Radisson’s corpse, but those aren’t good for more than sad laughter.

For all its bigotry, and there is plenty of barely-disguised Islamophobia in this film, it succeeds because of its small scale. Instead of a grand Arab conspiracy to control the world, we have a single abusive father who cannot respect his daughter's choice. Instead of the literal Antichrist, we have one overbearing professor. The adversaries are on a human scale, and therefore feel more believable.

That believability only makes them more pernicious. The underlying assumption, that all non-Christians are evil and all Christians are good, still holds. It's just harder to dismiss now that these characters are within our experiences of the world as we know it. The Muslim father is physically abusive, beating Ayisha and throwing her out of his house when he finds out she's converted. Amy's boyfriend literally dumps her because he doesn't think it's fair to him that she has cancer ("you're breaking our deal!", he exclaims). Professor Radisson's friends and colleagues are snobbish, pretentious, and demeaning to Mina over everything from her wine handling to her inability to understand Ancient Greek, which, to be fair, is an accurate depiction of most English majors, myself included.

And then, of course, there is Radisson. Unlike the other characters, the writers at least try to give him some depth and development. He is an atheist because as a child, he prayed to God to save his dying mother and it didn’t work. The experience left him a bitter, angry shell of a philosopher who, deep down, still believes in the Christian deity. He just despises him. In the climax, Josh berates his professor with the repeated question, “why do you hate God?” and Radisson finally answers “Because he took everything from me!” This prompts Josh to lay down the killing blow: “How can you hate someone if they don’t exist?”

Score one for the Christians.

 

For a movie billed as "putting God on trial", God's Not Dead spends very little time engaging with the arguments. Radisson only ever makes two points with his screen time: he argues that Stephen Hawking recently said that there is no need for God in the creation of the universe, and he argues that God is immoral because of the Holocaust, tsunamis, the AIDS crisis, etc. More shocking, however, is how little time gets devoted to Josh's arguments for God. We see a brief animation about the Big Bang and some vague discussion of Aristotle and the steady-state cosmological theory, we see the argument that the evolution of complex life happened very quickly when compared to how long only single-celled organisms existed, and we see the argument that without moral absolutes provided by God, there can be no morality. All told, it's only about 20 minutes of screen time.

Each of these arguments is deeply flawed. While Aristotle did believe the universe was eternal, Maimonides strongly disagreed with his conclusions, as did Thomas Aquinas. It's unfair to characterize science as having been united in support of the theory. Instead, it's a perfect example of the scientific method at work: a question was asked, it was debated by countless minds until we developed the technology to test it, and we found our answer. Modern-day forms of animal life did emerge only very recently when compared to the history of all life on Earth, but not only did that process still take millions of years, it also reveals a flaw only in the theory of Evolution as Darwin put it forward a century and a half ago. Since his time, we've learned that evolution likely functions in what's called a punctuated equilibrium, where evolution stagnates until systemic environmental changes force the web of life to adapt. Hell, even Darwin wrote "Species of different genera and classes have not changed at the same rate, or in the same degree." And the way Josh dismisses the entire field of moral philosophy with a wave of his hand is borderline offensive. Suffice it to say, there are plenty of ethical frameworks that have no need for a God whatsoever, many of which have existed for thousands of years. I'm partial to Rawls's Veil of Ignorance, to name one.

Nor is Josh the only one committing egregious crimes against the good name of reason. Radisson’s Hawking non sequitur is a complete appeal to authority: “Hawking says there is no need for a God, so God doesn’t exist”. It’s a piss-poor argument, and Josh is entirely in the right when he shuts it down a few scenes later. And as offensive as Josh’s dismissal of moral philosophy was to me, I must imagine Radisson’s appeal to injustice would feel much the same to religious viewers. The question of human suffering in a God-given world is called Theodicy, and it’s as old as monotheism itself. Every faith has an answer for it, which is why so many of the devout are converted in times of intense suffering. If you’re unfamiliar with this subject, check out The Book of Job.

While both sides' arguments are flawed, Radisson's are unquestionably worse. Josh's arguments are the same kind of nonsense you will find on the Christian right from places like PragerU. Josh even cites major evangelical apologists like Lee Strobel in his presentations. His failures reflect real deficiencies in American Evangelical discourse when compared to the breadth of Christian theology, and it's understandable that a college freshman might not be up to date on Contractualism or the Categorical Imperative.

Radisson's, by contrast, bear little resemblance to the anti-God arguments you might hear from an actual atheist professor. He uses some antitheist buzzwords like "Celestial Dictator", a favorite of the late Christopher Hitchens, but there is no engagement with what Hitchens meant by the phrase. "Celestial Dictatorship" refers to the fact that many of the traits Christians see in God (that he knows everything, is all-powerful, and judges you not just by your actions on Earth but by the intent in your mind to do good or ill) are traits we would call tyrannical despotism in a human being. It's a nuanced point that you can debate either way. Radisson uses it as a cheap insult. If this were your only experience with atheists, you would think they were all morons.

Some of this is malice, to be sure. It’s impossible to watch this movie and not see the anger the creators feel towards atheists. But there is also apathy here. Once Josh and Radisson agree on the terms of their debate, the professor moves on to assign the class their reading for the week. Two works: Descartes’s “Discourse on the Method”, and Hume’s “The Problems of Induction”.

Hume never wrote a piece with that title. In fact, while he talked at great length about that topic, as I write this, the first result on Google for "The Problems of Induction" tells us that Hume never even used the word "induction" in his writings. They could have pulled a list of his works off Wikipedia and gone with any title on it and no one would have batted an eye. But they couldn't be bothered. Instead, they made one up, based on some vague recollection of the topics Hume covered. The movie doesn't understand the arguments against the existence of God. It doesn't really understand the arguments for Him. And it doesn't care to try, because that's not its purpose.

 

Its purpose is spelled out in the list of lawsuits it shows you during the end credits. Most of them have nothing to do with the existence of God: they are either about the right to discriminate against LGBT people at universities or about misconduct by pro-life groups. I've taken screenshots of the list, feel free to peruse them.

You should also note that most of these cases were brought forward by a group called "Alliance Defending Freedom". The Southern Poverty Law Center classifies it as an anti-LGBT hate group, and its file on the organization is a haunting look into the worst of Christian fundamentalism.

The ADF is the legal wing of the far-right Christian movement, and has been instrumental in pushing the narrative that anti-LGBT-discrimination laws impinge on Christians’ First Amendment rights.

That is the underlying message of this film. It has nothing to do with the existence of God. If it did, it would have devoted more time to the debate. It wants you to believe that in this country, Evangelical Christians are a persecuted minority, spat upon by Muslims, atheists, and the academy at large.

And it is their fundamental right, granted to them by God and the First Amendment, to discriminate against The Gay Agenda.

God’s Not Dead 2 (2016)

Cinematically, the sequel is in many ways an improvement over the original. It ditches the first film's "Christ, Actually" style in favor of a single, unified narrative. While the first movie's characters (with the notable exception of Josh Wheaton) do show up, their storylines are either much shorter or directly related to the core plot.

They have also changed venues, from the metaphorical trial of a college classroom to the literal trial of a courtroom. A Christian teacher, played by '90s Nickelodeon sitcom star Melissa Joan Hart, is discussing nonviolent protest and Martin Luther King when a student asks her if that's similar to the teachings of Jesus Christ, and she answers with a few choice quotes from scripture. This lands her in hot water with the school board, and rather than apologize for her statement, she stands by her beliefs. The school board brings in the ACLU ("they've been waiting for a case like this"), and tries to not only fire her but revoke her teaching license as well. The trial soon expands to cover whether Jesus the man was a myth, subpoenas of local preachers' sermons, even a case of appendicitis. In the end, the jury finds for her, and the Newsboys play us out with a new song.

I want to begin by making something very clear: the Christians are in the right here. While religion has no place in a science classroom, I can’t imagine any way to teach AP US History without mentioning Jesus. Martin Luther King was a reverend, and his biblical inspiration is an established historical truth.

Also, Jesus existed. Even if you throw the entire Bible out, both Roman and Jewish scholars refer to him well before Christianity took off as a major religion. That doesn't necessarily mean that he was the Messiah, or even that most of the stories told about him actually happened. But the near-universal consensus of historians and theologians across the globe is that a Jewish rabbi named Jesus lived in Nazareth and was crucified on the orders of Pontius Pilate.

This film's problem is not its inaccuracy or ignorance: it is its slander. I actually spoke with a friend of mine who works at the ACLU about a hypothetical case like the one in the movie. He told me they would absolutely weigh in: on the side of the teacher. The American Civil Liberties Union's official position is that, while proselytizing from teachers is unacceptable, mentions of scripture in context are fair game, as are nearly all religious practices by students, on- or off-campus.

Each God's Not Dead film ends with a new list of Alliance Defending Freedom cases allegedly depicting infringements on religious freedom. In several of the ones I saw in this film, the ACLU wrote briefs supporting the ADF's position. You can see them on their website. The filmmakers aren't just lying about the foremost advocate for civil liberties in this country; they're stabbing their own former allies in the back.

There is a reason why the filmmakers made this particular organization the villain of this story. And it’s not just because the ACLU is the front line of the legal opposition to their pro-discrimination interpretation of the First Amendment. It goes back nearly a hundred years, to another trial over religion in the classroom: The Scopes Trial.

 

In March of 1925, Tennessee passed the Butler Act, a law banning the teaching of Evolution in public schools. The ACLU, then only five years old, offered to defend any teacher arrested for violating the law. A man named John Scopes took them up on it, and in July of that year was taken to court for teaching Darwin in biology class. The Scopes Monkey Trial, as it came to be known, quickly became a major national story, and the town of Dayton (population 1,701) played host to the finest prosecutors, defense lawyers, scientists, religious scholars, and reporters in the world. The proceedings themselves were a madhouse, and to my knowledge are the only time a lawyer for the prosecution has been called as an expert witness for the defense. Some day, I'll write another post on the events of the trial, but until then I recommend you read the play Inherit the Wind or watch the 1960 film. Both are based on the actual events of the trial.

Scopes was found guilty, and while he won the appeal on a technicality, the Tennessee Supreme Court declined to say the law constituted an establishment of religion. Indeed, it wasn't until 1968 that the Supreme Court of the United States ruled that such bans violated the Establishment Clause. But the trial still stands as the beginning of the end of Creationism in public classrooms. Some of that reputation is warranted: the testimony of prosecution counsel William Jennings Bryan, legendary orator, religious scholar, and three-time Democratic presidential candidate, was one of the greatest debates ever held on the nature of skepticism and the historicity of scripture. It's also one that Bryan lost, badly. The legal victory has done nothing to heal the wounds that defeat inflicted on the creationist cause.

That is the underlying purpose of this story: to relitigate the Scopes trial. It even has a climax featuring an unorthodox move by the defense, when the lawyer calls his own client, the teacher, to the stand as a hostile witness. During that testimony, he harangues her into tearfully admitting that she feels she has a personal relationship with God, and, in a masterful display of reverse psychology, uses that admission to urge the jury to destroy her.

In Inherit the Wind's version of Bryan's testimony, the play's renamed stand-in for him eventually claims that God spoke to him and told him that Darwin's book is wrong. The defense lawyer mocks him for this answer, declaring him "the prophet Brady!" Same scene, different day.

Their fixation on this trial doesn't just hurt them by forcing them to slander the ACLU so they can keep the same opposition in the courtroom: it also leads them to overlook a much more interesting legal plotline. During the trial, the school's lawyers subpoena documents from local preachers, including all their sermons. This actually happened, and whatever your opinion on LGBT protection laws, it should make you feel a little queasy. The pastors had a legitimate case that subpoena power over the pulpit could be abused to prevent the free exercise of religion, even as their opponents had a case that public figures' speeches advocating for political ends should be admitted to the public record. It's a complicated issue which sets the First and Fourteenth Amendments in direct conflict with one another, and could make for a truly thrilling drama. In God's Not Dead 2, it's just a footnote. Another way to show the ACLU is evil.

Forget that the ACLU actually opposed the sermon subpoenas.

 

There are many other carry-overs from the original film. It is, if anything, even more hostile to Atheists than its predecessor. One of the subplots follows the child who asked the question about Martin Luther King and Jesus in the first place. She has recently lost a brother, which has driven her to Christ in secret. Her parents are proud skeptics, and are offended by what they see as her teacher proselytizing in class. They are also completely apathetic to the death of their other child. At least Radisson showed some small semblance of caring for his fellow man.

Once again, the only plotline featuring nonwhite characters also features a physically abusive father. In lieu of Ayisha, they turn to Martin, a Chinese student with a minor role in the first film, who converts to Christianity and is disowned by his father for it. I am more sympathetic to this story than the last, since anti-Christian oppression is a real and well-documented problem in China, but the lack of other nonwhite characters (besides Pastor Dave's friend, who has done nothing in two movies) makes the choice stick out all the same.

And it has no problem giving a truly heinous interpretation of our Constitution. The defense lawyer, Tom Endler, begins his case by (correctly) pointing out that the phrase "separation of church and state" does not appear in our Constitution. Instead, it comes from a letter Thomas Jefferson wrote to the Danbury Baptists, assuring them that the government would not interfere with their right to worship. Endler then claims that this has been twisted in today's times to mean that all religion must be excluded from the public sphere. This reflects an interpretation of the Establishment Clause, shared by much of the Christian right, called "Accommodationism".

 

Essentially, the accommodationist perspective argues that the Establishment Clause should be read narrowly, covering only the formal establishment of a state religion rather than requiring the government to maintain no religious preferences at all. Accommodationists argue that the United States was established as a Christian Nation, though not of any one denomination, and that laws which reflect that ought not be struck down. In particular, they see a role for religion in general in the public legal sphere: that it "combines an objective, nonarbitrary basis for public morality with respect for the dignity and autonomy of each individual". And there are some compelling arguments for the theory. Yet it rests on a tortured reading of the Founding Fathers' discourse before and after the Constitution was ratified.

They were clear, for instance, that the protections of the First Amendment did not apply only to Christians. George Washington, writing to the Hebrew Congregation of Newport, Rhode Island, one of the nation's oldest Jewish congregations, the year before the First Amendment was ratified:

“It is now no more that toleration is spoken of, as if it were by the indulgence of one class of people, that another enjoyed the exercise of their inherent natural rights. For happily the Government of the United States, which gives to bigotry no sanction, to persecution no assistance requires only that they who live under its protection should demean themselves as good citizens, in giving it on all occasions their effectual support.”

Nor was Washington the only one who extended these protections to Jews. In Thomas Jefferson’s autobiography, he wrote of a proposed addition of “Jesus Christ” to the preamble of his Virginia Statute of Religious Freedom, a predecessor to the First Amendment. It was defeated soundly, and the bill was passed without the reference. Jefferson writes: “they meant to comprehend, within the mantle of it’s protection, the Jew and the Gentile, the Christian and Mahometan, the Hindoo and infidel of every denomination”.

More specifically, on the subject of Jewish schoolchildren being taught the King James Bible, he wrote: "I have thought it a cruel addition to the wrongs which that injured sect have suffered that their youths should be excluded from the instructions in science afforded to all others in our public seminaries by imposing on them a course of theological reading which their consciences do not permit them to pursue".

Clearly a man who felt that biblical teachings like those mandated in the Scopes Trial were allowed under the Establishment Clause.

As for the argument that religion provides a nonarbitrary basis for public morality, that same Virginia Statute has an answer: "That our civil rights have no dependence on our religious opinions any more than our opinions in physics or geometry".

That's not to say that Accommodationism is a wholly bankrupt ideology. It's all well and good to say that our Government cannot prefer one religion over another, but there are plenty of cases in which a full commitment to the Establishment Clause would constitute a de facto violation of Free Exercise. But the version espoused by the creators of God's Not Dead 2 is not interested in those grey areas. They believe that the United States was established first and foremost as a Christian Nation, and that our laws should reflect that supposed truth.

Or, in their words: “Unfortunately, in this day and age, people seem to forget that the most basic human right of all is the right to know Jesus.”

Where We Stand Today

God's Not Dead: A Light in Darkness, the third installment of the series, was released in March of last year. It couldn't even make back its budget, grossing less overall than either of its predecessors did on their opening weekends. Yet the Christian Film Industry lives on, with hundreds of movies available on PureFlix and more coming out every year.

Don't mistake this for market domination. Even within conservative Christian circles, the total audience for these films has never managed to eclipse Rush Limbaugh's alone. The conservative media sphere is vast, and even God's Not Dead was dwarfed by players like Fox News.

But that doesn’t mean it is meaningless. Even the third film of the series reached more people than the average readership of the National Review, long seen as the premier intellectual forum of the conservative movement. And unlike Fox News, these movies function largely invisibly: you won’t find them on Netflix, but to the target demographic, they’re sorted by subcategory (“apocalypse” and “patriotic”, to name a few) on its Christian doppelgänger. Their message is being heard loud and clear.

And that message is toxic. The world on display in God’s Not Dead, Left Behind, and all their knock-offs and contemporaries is one unrecognizable to those of us on the outside. It is a world divided into Christians and evildoers, where nonbelievers are at best abusive fathers and at worst servants of the Antichrist. It is a world in which the majority religion of the United States is subject to increasing victimization and oppression on the part of an Atheist elite hellbent on destroying God. It is one where powerful institutions for the public good, the ACLU, the UN, even our high schools and universities have been bent to their will. It is one that tells you that your identity is under attack, and that you must act to defend it.

Whatever they may have been in 1951, today these movies are not sermons on God. They are political propaganda. They are a carefully crafted cocktail of Christian trappings and conspiracy theories, designed to make their viewers see Muslims as hateful child-beaters, Atheists as amoral oppressors, and the fundamental tenets of liberalism and pluralism as attacks on their faith.

It doesn’t have to be this way. The same year that gave us God’s Not Dead gave us Noah, a biblical story that grossed six times as much and got far better reviews. And even in these films, there are moments of real honesty and insight. In the first Left Behind film, there’s a scene in a church where everyone in the congregation has been raptured, except for the Preacher. We get to watch him, alone with the symbols of his God, processing his loss and what that meant about his faith. It’s a little over-the-top, but it also forced me to put my phone down and think about the ethics of what I was watching. You wind up pondering consequentialism and virtue ethics, wondering why his role in his congregation’s salvation doesn’t count, whether it should if his motives were corrupt, all while watching real grief from a man wrestling with his faith.

Maybe I’m just a sucker for climactic scenes where men call out God in the middle of empty churches. But I think there’s more to it than that. There are good moments, lots of them, in these films. If they wanted to, these directors and actors and producers could force their audiences to confront their own assumptions, to strengthen their faith through genuine interrogation. They could give us a catechism for the modern, more uncertain age.

They can. They just choose not to.

Megarachne

Forty years ago, scientists revealed a fossil of the largest spider that had ever lived. While never reaching the fame of a T-Rex, the foot-long arachnid became a mainstay of natural history museums around the world, unparalleled in its ability to creep out and disturb children and parents alike.

Until the moment it wasn’t.

Decades later, we were told that everything we thought we knew about this animal was incorrect. For twenty-five years, the museums, the documentaries, and the general paleontological community all got it wrong. How did they misidentify the creature the first time around, and why didn't anyone notice until 2005?

It all begins in 1980. An Argentinian paleontologist named Mario Hünicken quietly announced an extraordinary discovery. He had found a fossil in the sediment near Bajo de Véliz which dated back to the late Carboniferous, 300 million years ago. It was a foot long, and appeared to show most of the animal's body and three of its legs.

It looked like this.

Megarachne

He named it “Megarachne”, or “Giant Spider”. It did the term justice. Based on this fossil, Megarachne was an ancient ancestor of modern-day tarantulas. With a length of 13 inches and a legspan of over 20, this specimen dwarfed even the largest spiders today, the Giant Huntsman and the Goliath Bird-Eater.

Using a technique called X-ray microtomography, which essentially uses X-rays to build a 3-D model of the fossil and reveal otherwise invisible details, Hünicken began to learn more about Megarachne's biology. There were two visible eye sockets, but also a central protrusion that could be space for more. It had an extensive set of chelicerae (essentially, a spider's jaws) at the front, visible as a bulbous protrusion. They were uncommonly wide and developed for a spider, and may have been large enough to have substantial pushing power on their own, giving the spider more ability to maneuver even large prey to its mouth.

While unimaginable today, a tarantula of this size would not have been that out of place on the Carboniferous Earth. The atmosphere was far more oxygen-rich, which meant that arthropods could grow far larger than they can today. This was an era with dragonflies the size of pigeons and seven-foot millipedes. Megarachne would not have wanted for prey.

It's unlikely this creature spun webs like the Golden Orb-Weaver. Instead, it would have built funnel-like nests from its webbing and waited for a lizard or amphibian or large insect to pass by. Lunging out, it would have used those scoop-like chelicerae to grab the animal, then inject it with paralyzing venom. Or, perhaps even more horrifying, it might have wandered the floors of the lush rainforests which covered the world at the time, stalking its prey like a tiger. If it did, it would have had to be careful: even this far into antiquity, the largest predators of the time could be as large as your average bear.

Unfortunately, it was hard to confirm much of this for sure. Megarachne was only known from this one fossil, and while several casts had been made for further study, the original, complete with the hidden details Hünicken uncovered, was essentially lost: sold to an anonymous collector and locked in a vault somewhere. It wouldn't resurface for another 25 years.

As a result, a great deal of what we knew about this animal was conjecture, based on tiny details only Hünicken and a few others had seen. He wasn’t lying, to be clear: casts of fossils can only show the imprint of the object, which obscures subtle details within the structure.

Hünicken's microtomography had revealed what appeared to be signature spider traits within the partial fossil: cheliceral fangs which could potentially deliver venom, even a sternum (in spider anatomy, the plate on the underside of the cephalothorax). They weren't as well preserved as the rest, so you had to extrapolate a bit, but it looked like they were there. And besides, the shape of the animal was clearly that of a proto-tarantula.

And yet, there were some doubters. Dedicated arachnologists pointed out a number of inconsistencies between Megarachne and other spiders, most notably the suture between the abdomen (back end) and cephalothorax (front end). Sutures are basically immobile joints, and in humans only exist in our skull. It meant that this spider wouldn’t be able to bend its body the way tarantulas today do, like this:

[Image: a modern tarantula bending at the joint between its cephalothorax and abdomen]

You can even see the suture for yourself in the cast at the top of this post: it’s directly above the curved line around the abdomen. There’s a white spot in the middle. It’s more visible in the original fossil, but you can make it out if you look closely.

Even Hünicken himself acknowledged the discrepancy, along with a few others. But they were easily explained away. 300 million years is a long time, after all.

Meanwhile, Megarachne was busy going as viral as a Carboniferous-era arthropod could go. We've always had a weakness for giant spiders, and here was a genuine monster. This thing got more than its fair share of museum displays, an unusual distinction for an animal of its era. Before the dinosaurs, but after the absolute freak show of the earliest animals, the Carboniferous had one thing going for it in terms of public appeal: the giant dragonfly. In fact, much of the time before the dinosaurs gets skipped over when discussing the extraordinary variety of life on Earth, but I digress. The point is, by the 1990s the museums were in the pocket of Big Megarachne.

 

Let’s recap. The creature we’re talking about would have looked something like this:

[Image: a to-scale reconstruction of Megarachne]

That image is to scale, of course. It doesn’t look like a modern-day tarantula. The body is fused into one big chunk, the suture at work. You can also see the spatulate chelicerae: they’re the two giant growths below its eyes. The smaller limbs between them and the legs are called pedipalps. They’re used to help maneuver prey while eating, and also as tongues and noses. And also penises sometimes. Spiders are weird.

And for about 25 years, that was how it was. Megarachne was a bizarre ancestral spider, made gigantic by an oxygen-rich atmosphere and sporting a set of fangs the size of lightbulbs. A few spider specialists in the community grumbled that it might have been a ricinuleid or a solifuge, but for the most part, it was accepted.

And then everything changed.

In 2004, another fossil was found in the same rock formation. It was unquestionably Megarachne: one telltale feature was the identical, and unusual, eye formation. It also looked a lot less like a spider. It’s the middle one in this picture:

[Image: comparison of specimens; the new fossil is the one in the middle]

By February of the next year, a new team of paleontologists, with Hünicken himself among the authors, published a new paper: Megarachne was not a spider. It was a eurypterid.

The eurypterids have long since vanished from the Earth, so we have no experience of them the way we do with spiders. For that, we should be eternally grateful. Also called "sea scorpions", eurypterids dominated the seas from about 450 to 300 million years ago, and lingered for a long time after that. The group included everything from filter feeders to apex predators, and its members ranged from a few inches long to the size of an American alligator. They would have looked something like this:

[Image: Pterygotus, a eurypterid]

In a single paper, Megarachne lost all its mojo. Not only was it no longer a record-holding behemoth, it was fairly small for its order. Nor was it an ambush predator at all: it fed itself by swimming along riverbeds, using its many limbs to capture the tiny invertebrates that lived in the mud and silt.

The scientific community didn’t question the results. The evidence was blatantly obvious, and Hünicken himself had co-authored the paper. Indeed, the speed with which the consensus on the animal changed is an example of science’s greatest quality: the ability to recognize when it is wrong and self-correct.

Yet the legacy of their mistake lives on to this day. Species names are hard to change once they've been assigned, so Megarachne retained its name. There is now a bottom-feeding eurypterid, not unlike a lobster, whose name directly translates to "giant spider". But this paper also came at an inopportune moment for a much larger entity: the British Broadcasting Corporation. Some years before, the BBC had aired the hugely successful "Walking with Dinosaurs", a high-budget docuseries narrated by Kenneth Branagh that won three Emmys. It in turn spawned a sequel, "Walking with Beasts", which catalogued the time between the Cretaceous extinction and today, and a prequel, "Walking With Monsters", which would air later that year.

“Walking With Monsters” is a masterpiece of the genre, and I encourage anyone interested to watch it. Like its predecessors, it’s rivaled only by big-budget action movies in the quality of its special effects, but with a degree of accuracy unparalleled by any cinema. The producers consulted 600 paleontologists, paleobotanists, geologists, and even astronomers to ensure that its depiction of a billion-year story was scrupulously accurate to the scientific consensus. They devoted segments to every era of life’s history before the dinosaurs, and with each one showed not only the path of evolution in our earliest ancestors but also the signature creatures of each epoch. And when it came time to pick the signature animal for the Carboniferous, there was only one natural choice: the largest spider that ever lived.

Selden, Corronca, and Hünicken published their paper several months before "Walking With Monsters" aired, but it was still too late to change it. Megarachne was the star of the entire segment; there was no way to easily cut it out. Nor could it be replaced. They could find a new animal, but that would mean consulting all the experts again, even more money for the special effects budget, a new script for an entirely different animal, hauling Kenneth Branagh back into the recording booth, and countless other barriers. It was either cut a hundred million years from the story of life on Earth, or bite the bullet and air the episode.

The BBC opted for the latter. It is, to my knowledge, the only time it has ever knowingly and intentionally aired fake news.

And so, despite being left behind by science, Megarachne lives on, not in the literature or the museum exhibitions, but in the minds of a generation of impressionable science nerds who saw it fight for survival on the television.

I am not ashamed to say that I am one of them.

Ultima Thule

A few days ago, the world was introduced to this photo:

Ultima Thule

This weird snowman-shaped thing is the most distant place humanity has ever explored.

Its name is Ultima Thule. It was flown past and photographed by the New Horizons probe, which famously got us our first look at Pluto back in 2015. Over the next few years, as scientists closely examine the data it's transmitting, we will gain a far greater understanding of how our solar system formed and what it looked like four and a half billion years ago. Its value will be immeasurable.

I’m not here to talk about that. Some day, when we’ve actually learned those things and don’t just have initial photographs, I might. In the meantime, I want to talk about something more innocuous: its name.

There's a lot of politics involved in the naming of astronomical objects. While scientific literature uses certain rules to convey the most important information (Kepler-22b, for instance, designates the first planet discovered orbiting the star Kepler-22), just try asking a fifth grader to remember the name 2014 MU69. The names of the planets have stuck around since antiquity, but what about the new ones we discover? Even within our solar system, new dwarf planets get found in the Kuiper belt (where Pluto is, basically) every year. Should we name them all after Roman gods? What happens when we run out? And why are we still using names from a mythology revered only in Western civilization?

Scientists have fixed the last two problems by expanding to new mythologies. Sedna, for instance, is named for the Inuit goddess of the sea. Smaller objects, asteroids and comets like Ultima Thule, are named for meaningful phrases or minor heroes in any number of languages and mythologies. But even that can be fraught. The scientists who named Ultima Thule have since come under fire for using the phrase, which has connections to Nazi ideology. To be clear, NASA didn’t intend that meaning. So why did they pick it?

Ultima Thule is usually described as Latin for "beyond the farthest land". Except that it isn't quite. "Ultima", related to the Late Latin "ultimare", or "to come to an end", is Latin, but "thule" is not a Latin word. It reached Latin by way of Ancient Greek, where it is not a native word either. If anything, it looks Germanic. Mixtures of Latin and Germanic words aren't uncommon in English, but the phrase "ultima thule" predates English.

To understand where the word came from, we have to go back more than 2,000 years, to a man named Pytheas of Massalia. He was a Greek geographer, born around 350 BCE in a colony that would later become Marseille, France. During his lifetime, he explored the north of Europe in a voyage he later described in his work, "On the Ocean".

Sadly, this manuscript has been lost to time. All we have of it are the excerpts quoted by other, later authors. But from those, we can begin to uncover his route.

His first hurdle would have been the Strait of Gibraltar, the narrow passage connecting the Mediterranean and the Atlantic Ocean. At the time, the rival nation of Carthage had closed the Strait to all ships from other nations. Historians have speculated that he began his voyage overland, journeying to the mouth of the Loire and constructing his ship there. However, it is more likely that he went through the Strait, either by running the Carthaginian blockade or by negotiating for safe passage.

From there, he travelled to Britain, likely the first Greek ever to do so (while there are older Greek artifacts in archaeological digs in the UK, it's believed those were brought there indirectly by traders). He circumnavigated the British Isles, mapping the coastline and making first contact with the inhabitants. This is where the name "Britain" comes from: it's a transliteration of a Celtic word roughly meaning "land of the painted men", referring to the Celtic tradition of painting their faces.

He discovered the Orkney Islands, to the north. He sailed along the southern coast of the Baltic Sea as far as the Vistula River in the north of Poland, and perhaps beyond, toward what today is Lithuania, Latvia, and Estonia. He was the first explorer of his people to encounter drifting sea ice. And he discovered a distant land called "Thule".

By Pytheas's account, Thule is an island six days' travel north of Britain. That would put it on the coast of Norway, or possibly Iceland. It lay inside the Arctic Circle: Pliny the Elder reported it as having no night on the summer solstice, a phenomenon known in principle to the Greeks but never before seen in person. The people there lived on roots and grain, and it is likely their language gave him the name "Thule".

From there, he sailed northward, but discovered nothing but frozen ocean and was forced to turn back before long. When he returned, the annals of his journeys became the Greeks’ best source of knowledge on these distant lands.

And of those lands the most distant of all was Thule.

This, then, was the source of the word, and of its mysticism: before Kepler, before Magellan, before even Rome, Thule was the last name on the map. Beyond it be dragons.

So far, this story is a tale of exploration and etymology that began and ended thousands of years ago. Where do the Nazis come in?

Today, discussion of Nazism as an ideology and ethos fixates on its genocidal xenophobia and antisemitism, but there was far more to Hitler's platform than "kill the Jews". And the other stuff was weird. Among Nazism's early influences was an esoteric philosophy called "Ariosophy". It mixed social Darwinist racial theory about the Aryan race with bizarre mysticism, and even gave the movement the symbol of the swastika. This evolved into a widespread obsession with the occult in the highest circles of Nazi Germany, particularly in the SS. According to Eric Kurlander, a professor at Stetson University, various members of the Nazi leadership believed in everything from Satanism to the claim that the Aryan race descended from the space aliens who built Atlantis. The iconography of the SS, with its skull rings and runic logo, was built on Heinrich Himmler's fascination with witchcraft and esoteric traditions. He even commandeered a medieval castle and remodeled it to evoke the Arthurian Grail legend. My personal favorite, however, is the idea that the Aryan race descended from superbeings that evolved from the inhabitants of icy moons that crashed into Earth in antiquity, like some antisemitic Kal-El, a viewpoint Hitler supported well into 1942.

From Ariosophy and the swamp of mysticism that abetted it came a group called the Thule Society. It was founded in 1918, and centered on the belief that the Aryan race traced its ancient origins to the Hyperborean people, who in turn came from the island of Thule. Each new recruit had to take this pledge:

“The signer hereby swears to the best of his knowledge and belief that no Jewish or coloured blood flows in either his or in his wife’s veins, and that among their ancestors are no members of the coloured races.”

The society didn't last long: while it's hard to trace the actions of secret societies, it appears to have faded away by the early 1920s. But late in 1918, a journalist and Thule Society member named Karl Harrer persuaded his friend Anton Drexler to form the activist group Politischer Arbeiter-Zirkel, the "Political Workers' Circle". In January of the next year, its members decided to create a new party, the German Workers' Party. A few months later, a young man named Adolf Hitler began attending meetings.

A year after that, the German Workers' Party had become the Nazi Party, which would go on to kill a third of all Jews in existence.

There is little evidence that the Thule Society itself had any influence on Nazism: the only person in that story who was actually a member, Harrer, resigned from the party in protest in 1920. True, the society was antisemitic, but so was most of Germany at the time. Yet its role in the origins of the great evil of the 20th century has left it inextricably linked with the Nazi Party.

Which brings us back to Ultima Thule. The name does not refer to the Nazis' use of it. In fact, the Thule Society was far more preoccupied with the island itself than with the ancient phrase for "beyond the known world". Yet its members, and by extension its beliefs, did play a part in the early origins of Nazism.

On balance, I don't think it matters. Even if you accept the counterfactual argument that Nazism would not exist without the Thule Society's input, the society is a footnote in the story of that ideology. It was merely one part of the same mystic tradition in Weimar Germany that played a role in Hitler's rise and the iconography of his movement. So was Thor, for that matter, and we still watch The Avengers.

As I write this, "Ultima Thule" delivers 125,000,000 search results on Google. "Ultima Thule nazi" has roughly 1% as many. Whatever its connotations may have been in the past, the phrase has returned to harmony with its meaning from antiquity. It stands for exploration, for humanity's innate desire to fill in the blank spaces on the map. It stands for the work of hundreds of scientists to build a machine that could travel 6.5 billion kilometers through the vacuum of space and photograph an object even our most powerful telescopes on Earth can barely see. It invites you to wonder, if 2000 years ago the name referred to Scandinavia and today it belongs to the most distant reaches of our solar system, where Ultima Thule will be 2000 years from now.

Place your bets now.