To be honest, adjusting to a prosthetic leg hasn’t been too tough. There are things I can’t do anymore, but I find myself able to manage the world. However, this is only because wherever I go, I find reserved parking and ramps or elevators where there might otherwise be only stairs.
Thirty years ago tomorrow, President George H.W. Bush signed the Americans with Disabilities Act, which guarantees me these things, along with several other protections barring disability discrimination. “Let the shameful wall of exclusion finally come tumbling down,” Bush said.
The law also requires covered employers to make reasonable accommodations for the disabled, so that we can continue to work. Since losing my leg, I think often of the Mad Men episode “Guy Walks Into an Advertising Agency,” in which a rowdy party is held in the office.
As in the case of my own accident, in a poor decision where alcohol was no doubt a contributing factor, a secretary ends up piloting a John Deere riding mower. The party’s mirth is shattered when she runs over the foot of an up-and-coming visitor from the firm’s new British owners…
“Now that’s all over,” the man’s boss later tells Don Draper, of the victim’s impressive career as an account man.
“I don’t know if that’s true,” Draper interjects, surprised.
“The man is missing a foot. How is he going to work? He can’t walk,” says another suit, icily.
When I watched this episode air in 2009, I found the notion jarring: that the mere loss of a foot would destroy an otherwise vibrant career seemed, to me, barbaric. It struck me as just as callous as the practice of euthanizing a horse. How could people talk this way?
The cruelty from these buttoned-down men seemed a relic, beyond me. Pure discrimination.
“But that’s life,” Joan tells Don in the ER. “One minute you’re on top of the world, the next minute some secretary’s running you over with a lawnmower.”
They laugh. It is funny.
Since a drunk driver took my right leg, I understand much more fully the context of the footless executive. Don Draper was the type of boss who would accommodate him, the other man was not. Without accommodations like those I mentioned, people like me cannot achieve much.
But for most of my lifetime, the ADA has been the law of the land.
We are often cynical about politics on the Internet but I know personally that our institutions, however flawed, are capable of making real differences in ordinary lives. I am proof, and I am grateful for it.
Today I thank the many advocates who paved the way, the 101st United States Congress, and the late President George H.W. Bush. Because of them, much of the world is accessible to me.
Nothing is “all over” in my life.
Learn more in the video below, which Google produced for the 25th anniversary:
On the morning of June 15th, comedian David Cross took to Twitter to deliver news to the nearly two hundred thousand accounts that follow him. Cross has, since the ’90s, featured in countless film and television projects, from family-friendly fare like the Alvin and the Chipmunks franchise to edgier material like Arrested Development. But to many, he will forever be known primarily for being the “David” in “Bob and David,” the duo behind HBO’s subversive sketch comedy cult classic, Mr. Show with Bob and David. His partner in that enterprise was the similarly talented Bob Odenkirk, known best to modern audiences for Better Call Saul. David Cross continues to perform in a variety of projects across multiple media every year.
But when he tweeted this week, it was not a new project that Cross sought to bring to his fans’ attention, but rather, one that is now five years old. In 2015, Netflix revived the Bob and David experience for a sketch show called W/ Bob & David. Reactions at the time seemed mixed, and Netflix only aired one season of the show. Nevertheless, Cross has said in interviews that he and Odenkirk (and Netflix) would like to make more episodes, and that it is only a matter of scheduling. Many (including this writer) would have been happy if it were a new batch of episodes that David Cross was tweeting to announce, but instead, we got this:
The offending sketch was “Know Your Rights,” in which Cross plays a YouTubing citizen-journalist who seeks to expose police misconduct at a DUI checkpoint. The joke is that he can’t get the cop, played by guest star Keegan-Michael Key, to violate his civil rights in the way he imagines will occur. It ends with Cross’ character darkening his face and pretending to be black, at which point a white cop played by Mr. Show regular Jay Johnston repeatedly maces him. Like the character, the sketch does not exactly succeed spectacularly, as far as comedy goes, but it’s funny enough. In the context of 2020, though, it does feel especially in poor taste.
If this sketch has anything to say about our modern moment in history (and it would be fair to say we should have no expectation that it does, as there are now five years between us and it), it would be this: it stands for the proposition that police are much more likely to use force against black people. That is the truth about our society which the offending part of this sketch satirizes. In that way, it is entirely in line with the apparent view of the many thousands of protesters who have recently flooded streets in America and around the world, a movement that seeks to expose and remedy police abuses along racial lines, of the sort this sketch ultimately highlights.
But to Netflix, it’s just an instance of David Cross wearing blackface. Sure, it was not enough of one to keep them from airing it on their platform consistently for the past five years, though the destructive and harmful effects of white performers engaging in blackface have been well known for decades (spoiler alert: they predate Netflix’s 1997 founding). That being so, I agree that it was in bad taste to have David Cross, a white man, darken his skin for this sketch. But it didn’t change my view of Cross, his collaborators, or Netflix. I know David Cross is an edgy comedian, but he’s also a famously progressive one, and I probably could never be persuaded that his comedy has negatively impacted marginalized groups.
But in the very fluid culture of 2020, this depiction suddenly plays as unacceptable, and Netflix chose to act. Hence, a seemingly unhappy Cross took to Twitter to voice his displeasure at this turn of events. He was joined in expressing less than a vote of confidence in Netflix’s decision by Odenkirk and countless others on Twitter, though plenty of others disagreed and suggested the sketch was plainly inappropriate. By the time I saw the tweet, Cross’ link to the video was broken. Netflix had apparently killed it. Here is another, which Netflix will no doubt successfully demand be removed shortly:
I have often wanted to write something about “Cancel Culture,” a hysterical movement that has crept across the cultural classes for the last decade or so, passing judgment on which entertainments can or cannot be consumed, and which artists can or cannot earn a living, based upon an amorphous coalition of cancellers’ cultural mores. Though often political, no political affiliation renders a person safe from cancellation. Sometimes a person is cancelled because of an accusation of an actual, heinous crime, and other times for something they’ve said, or otherwise expressed. For the same violation Cross committed, darkening their skin, no less than comedians Jimmy Fallon and Sarah Silverman, Canadian Prime Minister Justin Trudeau, and Virginia Governor Ralph Northam have all faced intense public scrutiny in the last two years. Like Cross, all four are otherwise known to have progressive attitudes about race, for which they advocate. It is no shock that Netflix would want to pull the sketch, and thereby proactively protect itself against cancellation archeologists sifting through W/ Bob & David to create a new such controversy on Twitter.
I’m like most people. When I see these cancellations occur, I sometimes find myself agreeing with their basis, and other times I’m revolted by their swift silencing of their targets. To focus purely on a spate of cancellations that followed from the appearance of the #MeToo movement in late 2017, I thought Harvey Weinstein was a rapist who belonged in prison; I believed Louis CK had acted inappropriately and should be condemned for it, but I was still his fan; I failed to understand any reason why my positive view of Aziz Ansari should change. But they were all initially “cancelled,” without much due process to speak of before our media culture cast them out. Today, Weinstein is in prison, but the ability of Louis CK and Ansari to reemerge has been mixed. There are clearly still marks on their careers, to one degree or another.
And that’s what makes Cancel Culture different from a boycott. Boycotts have been popular (or at least popular enough for people to call for them) for as long as I can remember, and famous ones, like the Montgomery Bus Boycott, occurred long before I was born. In a boycott, one is urged not to patronize the offending person or organization. A boycott would have called for audiences to shun Louis CK’s shows, movies, and comedy specials. In Cancel Culture, demands are made to the corporate parents who control those projects to smother them out of existence. The decision about whether an artist’s work is invalid because of who that artist is as a person is thereby removed from the hands of the consumer, and becomes solely a matter of how much you can make executives squirm with threats of damage to their own reputation. If this were purely a matter of David Cross, it’s likely that his other fans (like me) would not abandon him because of the poor decision making that underlay “Know Your Rights.” But for Netflix, a public company with a nearly $200b market capitalization, the bad press that can stem from being branded an employer of racists effectively obliterates whatever tiny bit of revenue even W/ Bob & David as a whole has generated in five years, let alone this five-minute sketch.
In other contexts, the calculus is different. President Trump has engaged in all manner of despicable behavior on Twitter. Twitter is by far the most censorious of the modern social media platforms. This seems like a match made in Cancel Culture heaven. Yet Twitter has never (until recently, and then only barely) acted to even remark on Trump’s transgressions. Why is this? With his more than 82 million followers, Trump is among the ten most followed accounts on the platform. But much more than that, even among accounts (like mine) that don’t follow the President, Trump’s tweets drive a substantial amount of activity on Twitter. If Twitter banned him, the platform would, by any measure, suffer a measurable loss of activity. There’s no comparing Trump’s productivity for Twitter with David Cross’ for Netflix. Thus, the President remains uncancelled. It really is a slimy financial decision, even for the never-miss-an-opportunity-to-virtue-signal Jack Dorsey.
I haven’t written about cancel culture before because I didn’t believe I had anything worthwhile to say about it. That is to say, I didn’t believe I had any kind of take on it that you hadn’t heard before. I can say I’m mostly against it, because it makes me uncomfortable anytime a mob demands someone be silenced, no matter the circumstances. It doesn’t matter to me that the speaker offends one’s sensibilities, even my own, whether through their speech or their conduct otherwise. My concern is the demand that they be denied an opportunity to be heard. I grew up in the ’90s. I felt this way about gangsta rap. I felt this way about Howard Stern. I felt this way about Pee-wee Herman. In my 37-year life, the cringe I feel when any kind of speech is abridged is one of the few things that age hasn’t moderated in the least. My feeling is that if the speech is unjust or harmful, the responsibility is on us to counter it with our own and expose it as intellectually invalid. Muffling it is never the solution.
If you think speech codes work, history is lousy with examples of them serving to undermine their own purpose. But you don’t have to look to history; I would rather direct your attention to Western Europe, where legal protections are firmly in place to prohibit hate speech, but where the political climate is currently darkening in the shadow of the rise of right-wing nationalists on a scale unseen in nearly a century. Clearly, the well-meaning speech codes are not serving their intended purpose.
And I understand the most important distinction: in the United States, we have the First Amendment to the Constitution to protect our speech rights from encroachment by the government. Netflix is not (yet) a sovereign, but rather a private streaming-video company, and they can operate their business as they please. Netflix’s removal of this offensive comedy sketch is not the manifestation of any edict by the state.
But it does make me uncomfortable, and here’s why: as I said before, this sketch was not too risque for Netflix in 2015. At some point in the interim, and certainly in the very recent past, after the developments of this Spring, it became too-hot-for-TV, to quote a phrase we once used. Thus, they pushed a button, and the sketch evaporated from Netflix. Zap! And it’s gone. Again, there can be no doubt that the link I’ve provided to another account which hosts a version of it will also vaporize in the near future.
In the realm of corporate censorship, this is actually an interesting departure. It has very little to do with my high-minded moralizing about access to speech, and everything to do with technology. This is where the part comes in about me having something novel to point out.
In the old days, a publisher would print a book. Maybe a provocative one. Then, there would be some movement to ban that book. This happened throughout the 20th century to books that are now unquestioned classics. This list is a great catalog of that phenomenon. But it didn’t just happen with books; it happened with music and movies and, of course, throughout history, with “ideas” that were purely verboten to subscribe to. Some of these once-harmful notions, like heliocentrism, remain contested to this day.
But say you were an ordinary, middle-20th century American living in middle America. The Catcher in the Rye has recently caused a stir in your town. It’s been banned from the schools, the public library, and now your bookstore isn’t carrying it because of the upheaval. Well, in that time and place, you were probably shit out of luck if you wanted to read it. Except for one thing: you bought the book six months ago, and it now rests on a shelf in your living room. No matter the local outcry, the chances are very low that anyone will enter your home against your will, seize the book, and destroy it. It’s possible, but extremely unlikely, particularly if you don’t advertise your possession of it to those who are irate about its existence.
That was the model of consuming media that we all used to live with. Books, records, tapes, newspapers, magazines, photo albums, CDs, DVDs, etc. Media was a physical good which we could possess. The companies who distributed these items parted with them forever when they sold them to us, and as long as we retained them, those companies could not alter our experience of them.
The entire model has now changed. We still consume media as written text, still images, moving pictures, and audio recordings, but we stream these things to our devices. Most of the time, we don’t even download that data in a way where we retain it for storage and future use. Effectively, Netflix broadcasts the episode of W/ Bob & David to us just for the time we watch it, and when we are done, it’s gone. Zap! If we want to watch it again, our device will have to send another request to Netflix for them to send it to us again.
Why does this matter? Because in this model, Netflix CEO Reed Hastings can push a button, and make it so you can never see the “Know Your Rights” sketch again. And basically, that is what has happened in this case. That’s important, because in addressing the question of whether “Know Your Rights” was racist, or in bad taste, or however else we want to frame it, we are going to be substantially at a loss without access to the original material. Reading this piece, or a description on Wikipedia in ten years, is not an effective substitute. These are forums in which my (or someone else’s) analysis will skew your perception of the original material. There’s no way to avoid that.
In each of these cases, the charges are hardly baseless. Like the “Know Your Rights” sketch, these titles feature content that is often in poor taste, or even downright offensive to right-thinking people. But do they actually promote a retrograde world? What is the likelihood that, if these things remained available to stream, they would leave an impression on someone impressionable, such that it actually changed that someone’s approach to the world in a negative way? I would venture that this is an unlikely outcome. I would even contend that it is less likely than the gangsta rap I defended in the ’90s was to influence a generation of kids to become gangsters, a non-event, despite the wildly popular form’s sustained embrace by audiences during that time. When Marilyn Manson and video games were similarly blamed for the 1999 Columbine Massacre, I scoffed. The monsters who perpetrated those killings had only their own warped minds to blame. The millions of other young minds, like my own, who had consumed Manson and violent video games were adequate testimony that such things can’t make a person into such a monster.
But by 2013, the winds had really begun to shift about how harmful art could be. That year, after the recent, even-more-monstrous-than-Columbine Sandy Hook massacre, the author Stephen King published this essay. In it, while addressing the controversy about guns in America, he discussed the book Rage, which he had written in the late ’70s under the pen name Richard Bachman. The book is the story (clearly ahead of its time) of a deadly school shooting. The problem for King was that the book had gone on to be an inspiration that many actual school shooters identified. However that felt, it is a feeling no author should ever have to know. It is true that King has written millions of words that detail nearly all the worst things known in the world, and many purely of his own invention, as well as many noble things, ones that touch the reader’s heart. But Rage was unique in his experience for having so visceral a connection to actual real-world carnage. He chose to let the book fall out of print. It is difficult to blame him, particularly because it was his decision, and not any effort to cancel him.
The problem is, by letting Rage vanish, in a world where copyright claims are often aggressively pursued on the Internet, nobody can judge for themselves how harmful Rage is. We cannot read the book. We must take its critics’ word for it, or its defenders’, or King’s, or that of the sick murderers who claim it as their inspiration. We can take any interpretation, except our own.
The real question is, can we even have this debate for much longer? I run this website, and pay a small price to have this content hosted here. But in a short time, all the links I’ve embedded in this post will become broken. We won’t be able to discuss how racist Gone with the Wind is, because neither you nor I will have access to the film to support our arguments, be they pro or con. This is because the film is no longer like the copy of Catcher sitting on your bookshelf, or a copy of Rage, for those who have one. It sits instead on an HBO Max server, somewhere far away, vague, and hidden. We are increasingly living in an age where we can’t own media anymore. We can merely rent it, subject to the whims of those who control the rights to it.
And here’s what makes W/ Bob & David special: if we’re talking about Gone with the Wind, there are millions of copies out in the wilderness. Sure, scarcely anyone has even a DVD player anymore, but we could if we wanted to. W/ Bob & David was a product created by Netflix exclusively to stream on its platform. There are no DVDs, VHS cassettes, LaserDiscs, etc. When Netflix CEO Reed Hastings pushes that Zap! button, its effect is total. He can utterly smite the “Know Your Rights” sketch from the fabric of our culture.
And this is actually happening all over. Not in large ways, but in small ones. The aforementioned Disney Plus edits were noteworthy, but this process of selectively rewriting history, once famously a tool of the Soviet Union at its peak, now permeates our media culture. The New York Times will change a front page headline to appease the hardliners in its party. The Times, recently a hotbed of many controversies that could be classed as part of Cancel Culture, even rewrites its own coverage about its internal controversies. And the Times, at least for now, is still reasonably described as the country’s “paper of record.” If it can happen there, it can happen anywhere, and pretty soon you’ll see it happening everywhere. As media, writ large, moves more and more online, and becomes something we merely stream when we want access to it, those who control the content will regularly edit it to serve their momentary interests.
This raises an interesting question, one that has never been satisfactorily answered: who owns the Star Wars trilogy? As a pure matter of intellectual property, surely it was Lucas (now Disney), much like Stephen King owns Rage. But the people who ask this question aren’t talking about traditional notions of American copyright. They are more concerned with the claim they laid on the work when they invested in it, usually as children, with their hearts. Nobody cancelled Star Wars, and there’s no suggestion that anyone would, but the idea is the same: Disney is deciding what fans of the series can experience, for them. If the company wanted to provide every possible version that has existed since 1977 on its platform, it could do so for a pittance. Yet, it does not. It provides one uniform experience, deemed by the company to be the flavor of Star Wars suitable for good people. For the rebels who believe otherwise? Those scum will have to hide from this empire on the dark web, and in other nefarious corners of the Internet galaxy, trading video files of dubious quality, and always running the risk that Disney’s legal department stormtroopers have pinpointed their location.
If the censoring of the “Know Your Rights” sketch was what Netflix was doing behind the scenes on June 15th, what were they doing in front of them? The company announced a $5m gift to organizations dedicated to creating opportunities for Black creators. That sounds like a lot of money (at least to me), and it’s a gesture no doubt meant to show that social justice isn’t just a matter of lip service and steamrolling the creative expression of Gen X comedians. But this is a corporation that will spend an estimated $17.3b (as in billion) on content this year. If we were looking at a $17.3b pie and I asked you to cut me a $5m slice, how small would it be? About three hundredths of one percent. Would it be as large as the line a knife displaces in a delicious real pie? I doubt it. The company reportedly spent the better part of $100m just to secure Martin Scorsese an Oscar this year, a mission I rightly predicted was destined to fail. I don’t think one has to be an utter cynic to question what meaning their $5m donation has.
David Cross is not in danger of being cancelled. At least not yet. His comedy is transgressive, but not quite transgressive enough for a mob to form and demand his removal from show business. But as of today, you can experience exactly one less comic moment from his storied career. Sometimes the sword of the canceller acts swiftly and dramatically. But other times, as here, it merely chips away. It may do so slowly, but in enough time, the original thing is rendered so changed, so unrecognizable, that something entirely different is left in the relief. This is how Cancel Culture seeks to remake the media landscape. This is the only version of life they want you to be able to stream.
Retail is screwed. There isn’t another way to put it.
Actually there are other ways to put it, but that was the least profane one I could settle on after an editing process. Everyone realizes that, at this moment, the retail sector is undergoing the biggest disruption in the history of modern markets. Simply put, the stores are closed. So they can’t sell anything. This is not a complex economic problem. Yes, some retailers, those deemed essential, remain operating: the grocery stores, and the Walmarts and Targets that have groceries but are mostly for other purchases. Then there are Home Depot and similar stores.
Everyone knows this is all kind of ridiculous. How do we know a store is essential? It exists. That’s a fact. If a store was non-essential, it would cease to exist. Many have. Borders and Circuit City once existed, because they were once essential. Now they don’t because they aren’t.
I realize that the grocery stores provide actual sustenance, which is truly essential to maintaining life, and that’s a far cry from DVDs and hardbound books, but the powers that be put no limitations on my quest for sustenance where I live, in Southern California. I can go to the grocery store as much as I like. If I want to buy colored pencils today, and cigarettes tomorrow, I can do it. I now have to wear a face covering, but other than that I’m good to go. We are not living with ration coupons, as Americans did at times during World War II. McDonald’s, too, is an essential business in this pandemic. This is the reality where I live, but nowhere in the country are the restrictions limiting the consumer to what is strictly necessary to stay alive. Even in Governor Gretchen Whitmer’s Michigan, where severe restrictions have been placed on many goods, I could still participate in the state lottery, which she has deemed essential.
And because the retail window has narrowed to these places, business for them is momentarily great. I used to work in grocery, and when the panic buying began, a friend still in that business told me that his store was consistently at 400% of normal business. All kinds of items were flying off the shelves, in scenes familiar only to those of us who have seen the lead-up to a blizzard, and the only problem was one of supply.
But it can’t last. And clearly it hasn’t lasted. Every time I go to the grocery store now, it is evident they have a greater and more consistent supply than has been typical for the last two months. Soon, it will be effectively normal, with the odd outlier item, hand sanitizer or various things that count as PPE, being still hard to find, and in enough time even those things will be back in abundance.
But online retail has done great for itself too, in this crisis. The exact numbers aren’t clear, but the unprecedented delays in shipping times and fulfillment advertised on Amazon lately speak broadly to how much business they’re doing, both in groceries and in the random consumer products they’re better known for. Plenty of competitors have been able to pick up the excess business from customers who don’t want to wait out those delays. So it’s safe to say that online, business is booming too.
In the city where I live, sales tax is the single greatest generator of revenue. And we have bills here. We have to pay them. We have fiscal liabilities past, present, and future, and these taxes go a long way to keeping us solvent. We saw municipal bankruptcies in the financial crisis, and we’ll see them again. I don’t anticipate anything so dramatic in my city, but there will be pain, and it’ll be felt for a long time. Such pain was felt all over the country after the financial crisis, but our economy rebounded. I have zero doubt the national economy will rebound from the current morass too, but that recovery will not be evenly distributed.
The problem is that the current crisis is going to permanently alter consumer behavior, at a time when it was already in a long term trend of being altered. In a recession, people don’t go shopping because they don’t have the money. But once they have it again, they’ll be back. Before the pandemic downturn, it was obvious that consumers were back to behaving about as close to 2005 as was possible in light of newer rules. Consumer debt was back at record levels. Credit card debt had roundly defeated its 2008 record.
Many people were already buying many things online. That’s obvious. You and I both were. But not everybody was. There was a type of consumer who stubbornly stuck to the retail transaction, and who kept many in the retail sector on life support. Call this person Joe. Old Joe. He’s 50-80, and a late adopter of trends and technology. Recently, Joe bought something online, perhaps for the first time in his life, because he wanted a particular thing and he had no other choice, because the place he would have bought it in person is closed. And even if he had bought something online before, maybe Joe had to do it himself this time, because social distancing precluded his daughter coming over to do the work behind the keyboard. And what did Joe learn? Well, this is 2020. He learned this process is easy and cheap, and someone brought the thing right to his door.
So Joe’s behavior will change now. Yes, he will probably go back to the store when it opens back up. But he’ll go a little less. And the way that profit margins work—think small—in retail, particularly in publicly held companies (which most of the retail names you recognize are), a light trimming of those margins can be fatal.
The stores have bills, too. They aren’t camping out on those enormous chunks of land for free. Their employees don’t show up to work for fun, either, despite official corporate policies that often define fun as mandatory. And then, of course, many of them exercise less than the utmost care with regard to debt. Those costs are going to catch up with those businesses this year, and in many cases, they’re going to seize their prey. It won’t take that many Joes doing business elsewhere for there not to be enough money on the table at the end of the day for an apparel store, or a movie theater, or a restaurant, or the somehow-still-extant office supply chains, to meet their obligations. Some of those may not seem like retail proper, but more on that later. In any case, you’ll see examples of all of these things fail, and you’ll see Halloween stores breeze in and out, in their places, with the autumn weather. Then they’ll just be husks that remind us our world has shifted.
And as for grocery, and other essential stores, 400% of normal business? Watch what happens next. During the initial panic, a lot of people were unable to get what they needed from the effectively sold-out grocery store. A lot of people turned to Amazon Fresh or a similar online food merchant for the first time. Now they know the ease with which they can order their food and have it brought right to their door. Joe isn’t good on the computer, but he’s learning, and people learn very fast when it’s out of necessity. These stores are much more likely to survive than their non-essential peers, but when the dust settles their market share will have undergone a substantial net loss. Sure, a lot of people, even most, will continue to use the grocery store as they normally did. But a few won’t. A few will give it up forever. And the grocery store has the same margin problem as everyone else. A small bite out of their normal baseline business will be felt through entire companies.
The economic fallout is going to be the real fallout from COVID-19. The public health crisis is truly a nightmare, and a tragedy, and a source of frustration that any person capable of thinking and caring has felt in their heart, whether it touched them personally or not. But know this: the economic fallout will touch everyone, and there will be no exceptions. Even the consistently and wildly rich will be slightly (or substantially, depending on how they are positioned) less rich. The poor who were struggling already will be cruelly forced into S.O.S. territory. And everyone in between will have their own experience, none of them a positive one. Only random, cunning profiteers in this crisis will have anything to brag about, and if they are wise, they will refrain. The appetite to eat the rich never subsides, but it is prone to growing from quiet and unassuming to suddenly loud and fevered.
And it’ll matter when we lose retail. It’ll be a change in American life. It’ll be a lot like how a previous generation watched the long, slow death of manufacturing, and ironically, manufacturing will reappear, at least for a trial run, as scenes of the hurt this crisis put on the global supply chain become starker.
I am from a small town. I moved there when I was three years old, and there was nothing in that town. There was just a sleepy downtown, and everything else was set in endless acres of wood. Over the course of my adolescence, I watched it change. They built houses, but the biggest footprint that reshaped the landscape was retail. Stores, restaurants, a movie theater—these things all count for the purposes of this discussion, because the defining characteristic here, unlike in prior debates about online retailing killing conventional retail, is that these are businesses where the public congregates. That’s going to be what keeps people home, not the mere ability to buy a thing cheaper or with more convenience.
I think we generally agree that restaurants will remain essential. It’s nice to go out to eat. You can’t stream this experience. People need to go on cute dates somewhere. But many of the restaurants are mall-adjacent, so to speak. They sit in between big stores and little stores that, up until six weeks ago, processed thousands of transactions every month, kicking a little off the top of each to various levels of government.
The movie theaters are less essential. If anything, I’d seen their demise as inevitable, even before the pandemic. They’re a nice place for a cute date too, but there’s a unique problem: they’re enormous. One of the reasons smaller movies have disappeared from theaters is that their losses aren’t just measured in the number of people who don’t buy a ticket to a Tim Roth movie, but in the tickets that could have been sold to a Marvel movie, if it were playing on that screen instead. But even those movies packing the house can’t heave the enormous burden of justifying the footprint of the megaplexes. The prices have risen sharply, as most of us who aren’t children join a growing consensus that the product has gotten worse. They’re not making movies I want to go see, or when they are, they aren’t showing them in the theaters. I desperately wanted to catch Martin Scorsese’s The Irishman last year, but it didn’t come to any theaters in my area until well after its Netflix premiere.
And that’s the other problem. Streaming has been breathing on the theater exhibitors’ necks for years, and in this pandemic it has almost certainly finally taken a bite. Netflix picked up an astounding 16 million new subscribers last month. Joe is probably scrolling through all those choices—none of which will cost him any more than his monthly fee—for the first time. He’s realizing that this sure beats paying nearly $20 for one movie that he had to fight the crowds just to get into, after a long trek in from the parking lot, only to be disappointed by its poor quality.
And okay, not everybody is Joe. Lots of people really miss the movies right now, and they want to show back up once it’s safe to. But again, consider the margins: those theater owners need to sell a lot of fucking popcorn just to keep the air conditioning on. It won’t take that many Joes or their children bowing out of the Friday night rat race for a huge bite to be taken out of that business. If you add up all the people who decide streaming is better, and all of the people who are simply going to be too afraid of germs, it’s going to be a lot of people. You probably feel one way or the other about this, but if you feel like you’re ready to go back to the movies now, try this thought experiment: they’ve reopened the theaters, and you’re sitting in the first movie you’ve been to in months. The lights go down and people stop chatting, and that intro starts playing and you’re happy to feel some semblance of normalcy again. Then, in the darkness, a cough. How do you react? What do you feel at that moment? Do you tell yourself, “it’s probably nothing?” What if it’s the person right behind you? What if it’s you? Do you notice that every eye in the place suddenly turns in your direction? Is any of these scenarios one in which you can comfortably watch a movie?
And retail otherwise? It’s not even worth addressing. Neiman Marcus is said to be headed into bankruptcy while I write this. Like tomorrow. So imagine a mall, and the big anchor store vanishes amidst its corporate bankruptcy, and the movie theater quickly follows behind it, in a whirlpool that sucks in all the smaller stores, because it’s depressing to go to one of those fucking malls where only every third store is open for business. Yes, Chili’s is still open. You can get your frozen margarita and Awesome Blossom. But that mall is pretty big. That Chili’s, and the Olive Garden, and whatever else they’ve got going on there isn’t going to justify devoting that enormous plot of land to commerce, even if it seemed impossible to go wrong 20 years ago.
So they’ll bulldoze it. They’ll flatten it. That’s what’s happening: retail, literally and figuratively, is undergoing a flattening right now. And it will be a domino effect that touches things a world away from retail. In my city, we’re going to feel those lighter pockets. We, who lived through 2008, all understand what systemic shocks to the economy are. But that won’t make it feel any better when it creeps into our own place of work, retail or otherwise, and exacts its price on us.
Not everyone agrees with me. In fact, if you read or listen to market voices, many clearly do not. But one point I should make is that if we were speaking strictly in analogies, this would be a breakup, not a death. Nobody is denying that people are going to keep buying things. Likewise, in 50 years, I have every reason to believe you’ll still be able to visit something called a store in your community, where you can do this. The point is more that everything as we know it will change, more than it already was changing. However many stores exist in your city in 50 years, they will be far fewer than are needed to fill up the indoor malls and strip malls of the size that currently dot the landscape in America. Effectively, there is no getting around the point that cities and developers are going to have to adjust their plans toward revamping that landscape into something else through broad changes, just at the time when these changes are forcing a resource crunch on both. My only advice would be to start early. If you look at the places in America where manufacturing was most thriving in 1950, it’s not going to make you optimistic.
If you’ve ever gone through a bad breakup, you know that some version of Elisabeth Kübler-Ross’s five stages of grief is real. I say some version because personal experience tends to really vary on this concept, but few would deny that its broad strokes are relatable. Generally, these stages are: denial, anger, bargaining, depression, and acceptance. Before the pandemic, the economy was flirting with bargaining: many malls were being reinvented as mixed use, often with a living component. But some municipalities were stuck in denial when it came to the approval of these projects, for a host of reasons, but one undoubtedly being the revenue that only retail delivers to local government. While we’re likely entering an economic depression at the moment—and it should be noted that outside of the 1930’s, American economic depressions don’t usually last many years—the retail sector is likely to oscillate erratically among these stages over the next, say, five years. There will be denial for some. Others will move quickly toward acceptance. There will be seeming hops back and forth between every stage in between for many, while other parties will try to shop the stages à la carte. It will not be pretty, for anyone. This is an industry that directly employs 29 million people in this country. U.S. manufacturing only employed 19.5 million at its 1979 peak. You can imagine how this would be a substantial disruption, even if I’m only a quarter right. Ultimately, the market will arrive at acceptance, because acceptance is the stage that rises to meet you, even when you resist it. In a breakup, a person who is truly incapable of acceptance is, sooner or later, going to be met with a restraining order. For parties with an interest in retail who are too stubborn to embrace acceptance, it’s going to be the forced dispossession of their assets, and this is likely to include municipalities, where they are truly incapable of change.
There is hope. There will be normalcy. But it will be eventual normalcy, not quick and easy normalcy. The whole economy is changing. There isn’t even time to talk about demand for office space being a certain casualty of this global experiment of learning who can and can’t work from home. Nor is it worth going into how needing to sanitize and cover your face is going to negatively impact the user experience of all in-person business once something called normalcy finally reemerges.
But you’ll adjust to that too. You will persevere through all of this, and this will qualify as something worth telling your grandkids about. You’re allowed, right now, to live day to day, learning as you go, and making mistakes as you wouldn’t choose to, but nevertheless must. That’s what we’re literally all doing. If you listen closely, you’ll note that even the experts (even the ones we laud) are learning as they go, and revising their guidance accordingly. Challenges are the norm. A life of complacency is not. If you’ve made it this far, your luck is pretty good.
I buy used things. I buy them as much as I can. Some things, a person must (and should) buy new. Food is one. Medicine is another. These things are consumable once, and then they exist only in your memory. But generally, in life, the objects you interact with on a daily basis can be reused again and again, and there’s no reason to dispose of them. This is the frugal approach to living in a consumer society.
If only that were my reason. I’m not, by my nature, a frugal person. I once bet $500 on a hand of blackjack while unemployed. And won. Don’t worry, my lifetime losses far outnumber my gains, at least at gambling.
But the objects I surround myself with have generally been owned by another person. I buy all my clothes at the Salvation Army, Goodwill, or similar stores. The main exception is my suits, which are hard to size in the used market, though I nevertheless own several used ones. I drive two used cars, one a 2007, and the other a 1992. Sometimes people question the wisdom of this, and I’m fond of saying, “Do you know what the most reliable kind of car is? A whole other car that’s sitting in the driveway and still working.” But once, in 2013, I owned three used cars that were all in need of work at the same time.
The walls of my apartment are lined with records, tapes, and books, almost none of which I bought new. The dishes in my kitchen are used. A lot of the furniture in the house is too. I used to buy only used furniture (aside from mattresses, because that’s gross) but since losing my right leg in 2017, it’s been harder to carry large things on my own. But I still do it when I can, which is too often.
There are exceptions, which I cling to like a snob. I buy New Era fitted hats at a rate of roughly two a year. I beat my sneakers to death, but I buy them new at those discount stores. Technology is something I try to keep up with, so my computer, phone, and TV are all relatively new. But these things are purely utilitarian: my shoes take me to the interesting places I go, and I measure the years in the wearing out of my hats. Technology is the way I communicate with the broader world. All of these things facilitate my interactions with other people, which is the most interesting part of any life. Even in movies about the last person on earth, what’s interesting is the lack of interactions and what that makes the person do, who we then interact with as an audience. Of themselves, they’re nothing.
But I find a ton of things to be interested in simply because they are secondhand. Case in point: I own three cats. The oldest one is named Solo, and he was a few years old when I got him, after his former owner (a guy about my age) died of cancer, and Solo was then abandoned by that man’s girlfriend at a vet. The youngest one, who is called Fancy, was a stray picked up and put in the animal shelter last summer. She caught my eye there when I saw her rolling around in her own (clean) litter box, perhaps misunderstanding what it was for (she doesn’t do this now). The middle cat was just an 8-month-old kitten (“new”) when I got her, but she already had a personality—her name was Bell, and I was told this is because she “will sing out at you first thing, just like a bell!” This has proven true: she greets me almost any time I come within a few feet of her.
Before I had cats, I had a beautiful German Shepherd named Daisy. After my accident in 2017, I was in bed for a year and in no position to take care of a dog. Today, she lives with my parents on the other side of the country, who have a yard big enough for her to feel like she’s living the American-dog-dream of homeownership (which is lived by digging holes), and I have no interest in taking her away from it. But she came to me with a story too, having been abandoned by persons unknown in Los Angeles a decade ago. Even though somebody didn’t want her, it didn’t affect her personality. She has a wonderful temperament, and a heart of gold, even though she’s tough as nails when she needs to be.
All of my pets had their names when I got them, and I didn’t change them, because they had a story before they met me and I didn’t want to forget it. I know these few details I’ve said about where they came from, but I can never really know because they can’t tell me—but it’s interesting to speculate.
This is what is true of all my used things. They have a story that began before they were ever mine. This is the thing that makes them interesting, because despite their other aesthetic or objective qualities, they have a third dimension exclusive to those things which have existed in the world for some time.
I have my own story which I invest my things with, and it is interesting, but it isn’t interesting to me. Life doesn’t work like that. This is kind of like how no matter how attractive you are, you could never be said to be dating yourself. If you’re thinking otherwise, I would advise you get your mind out of the gutter. Some people think narcissists (or sociopaths) might be able to find interest exclusively in themselves, but I don’t think even this is true. Human beings want to go on a journey outside of themselves. This is innate. This is why you can look at a globe and point to places where people are living on every part of it, despite the fact that countless ones had to die to get there. If it were any different, we would never have ventured out of Africa.
And venture I do, in my own way, as a modern man. I can spend upwards of an hour in the Salvation Army looking at all these objects and wondering what they meant to someone once, even the ones that I would never buy. Just the T-shirts are fascinating: if you’ve never browsed a place like this, I think you’d be amazed by all the evidence that exists there of forgotten retirement parties, 5k runs for charity, past contenders in the playoffs or for political office, McJobs people once had, school graduations happily discarded, and much, much more. Sure, it’s a thrift store, but it’s also a library. It’s a place that catalogs information about the past which nobody needs, but I learn from it. I’ve walked into a store like this in the middle of Iowa or Indiana and walked out knowing a hell of a lot more about the kind of things that go on in that town.
I’ve always been this way. When I was 5, I loved Star Wars more than anything. It’s easy to forget now, when that brand is so ubiquitous and popular, but between the mid 80’s and the mid 90’s Star Wars disappeared from the culture almost completely. You couldn’t walk into a Toys “R” Us and buy Star Wars paraphernalia. Most people thought, with good reason, that George Lucas was never going to make another movie. Hollywood and Hasbro had their minds set on pushing new brands. But I really loved Star Wars, so on weekends my dad would take me to yard sales and the flea market (what they call a “swap meet” out west where I live now), and we would search for these things. It was like panning for gold. Oftentimes, we came up empty, but because that was the case it really felt like the discovery of a treasure when I did find something. When a yard sale yielded me a Han Solo action figure in Empire Strikes Back “Bespin” gear, I felt something like I one day would hitting that blackjack hand. When that Han Solo was accidentally decapitated, it was a tragedy. Don’t worry, my mom glued it back on—only Han could no longer turn his head to look to his sides.
Those places were great, for five-year-old me. I had all kinds of weird hobbies. I had endless comic books from the 60’s and 70’s. I had Golden Books. I collected 8-track tapes. I had a stamp collection. I had a coin collection. I knew the term “collector’s item,” and liked believing that’s what my things were, but they were generally cheap or worthless at the time. Ironically, I see many of them fetching a high price today. I saved a lot of them into adulthood but lost, discarded, or destroyed a lot more. But you know what? It wasn’t a bad way to get an education.
I was not a good student. I couldn’t sit still. I thought my teachers were often tyrants and more often not very smart. Some were kind, and really tried to “save” me in that earnest way unique to a certain kind of public educator, but I successfully defended myself from any saving. I mouthed off at will. I was fat, and when other kids bullied me, I fought back, and pretty soon you don’t know whether they’re starting the fights or you are. When you become a person who’s used to swinging your fists around, not surprisingly, you find yourself swinging your fists around at the drop of increasingly less important hats. They wanted to keep me back a grade, or kick me out, or start me on Ritalin, or start me in special ed, all depending on which hapless administrator was tasked with my problems whenever they boiled to a level beyond ignorable.
But like I said, I did manage to get an education. Here is one example:
Do you know the story of General Douglas MacArthur? He was a five-star general in the United States Army who served in both World Wars. The five-star rank has not been conferred since 1950, which is why there are none living today. Only nine officers have ever attained it, and MacArthur was one. He was once the Field Marshal of the Philippine Army, and fought the Japanese there during World War II, receiving the Medal of Honor for that service. While he was reputed to be heroic in his time, upon a closer examination of his ability as a tactician and especially because of subsequent service after that war, history has not judged MacArthur well. Today, he is often cited as a flamboyant egotist. This is mainly because during the Korean War, he effectively decided that he, and not President Truman, was the Commander in Chief of the armed forces, and led an unauthorized campaign that nearly began a third world war with China. Truman then famously fired him in perhaps the greatest confrontation between military and civilian leadership in American history.
Maybe you knew this story. Despite my poor aptitude for matters academic, the aforementioned are facts I (roughly) knew as a five-year-old. Here is why, and also a part of the story you might not know: during the Second World War, MacArthur was such a force to be reckoned with in the Far East that money was printed in the Philippines with his picture on it. These notes, which were denominated as pesos, are now called “MacArthur Guerrilla Money.” I learned this fact at the same time I learned all of that other information about the General which I have already imparted.
This is because when I was five, I went into a coin shop with my dad in Plymouth, Massachusetts. It was filled with interesting things I could not afford to buy. But specifically, I asked the owner about the wadded up money I had noticed in the corner of the display window that faced the street. Unaware, he went there and pulled it out—it was several five-peso bills with MacArthur’s face on them. He explained to me what they were, who MacArthur was, and that they were worthless. He said I could have them. I learned more about MacArthur and his Korean War firing from my dad on the car ride home. Today those notes are worth more than worthless, but not much more: they’ll run you about $15 each online.
This is just an odd curiosity about MacArthur, but I think it provides shading to the broader historical picture of him. Is it surprising that a person who, largely by historical accident, found himself so important as to have his face printed on currency (and there were coins minted too) would decide himself qualified to reject a president’s orders and unilaterally direct the military destiny of the United States? Perhaps not. There are other strange anecdotes that paint this picture of the man, many of them from his time between wars directing the occupation of Japan, but this is the one that I grew up knowing about.
The major movement in American education for the past two decades has been toward STEM being the only thing worth studying. Sure, this is important, but it really has come at the expense of the wisdom that only history can teach, in addition to disregarding other important subjects. I have a calculator on my phone that has been able to solve every math problem that I’ve ever encountered as an adult. How the millions of lives lost during the twentieth century to fascism and communism bears—and doesn’t bear—on the kind of policies being debated in present-day America is a much more elusive answer to track down. Also, I suck at math.
But this is how I, who didn’t take so well to school, grew up to be an educated person: by asking questions and conducting my own analysis of the objects I encountered. Today, the internet would teach you these sorts of things, but in 1987 we didn’t have anything like that. We had the public library, which was much more tedious. The kind of rabbit hole that Wikipedia will streamline you down in 15 minutes these days would have taken hours of cross-references and rampant shooshings back in the library of the 1980’s.
Do you find the story of MacArthur Guerrilla Money interesting? I always did. I could tell you a thousand like this. You might think I’m wasting my time in the thrift store, and I might be, but I don’t live this way for nothing. I don’t come up empty-handed. Like Horatio at the end of Hamlet, the players might be mostly dead, but I walk away with the story to tell. Of Hamlet, there are multiple used copies in my home, just in case I ever find myself needing to perform the play with others on the spot.
The stories I learn are personal ones too. My daily driver is a 1992 Buick Roadmaster sedan. When I bought it in 2019, it came with extensive service records. I am the third owner. It was originally purchased in 1992 by a couple named George and Audrey. For it, they handed over $25,799.45, in cash, buying the car outright from Hatfield Buick-GMC in Redlands, California. While financing cars was already standard by 1992, George and Audrey were born in the 1920’s and it was hardly unusual during most of their adult lives to purchase a new car in full at the dealership. While 1992 was the first model year of this era of Roadmaster sedan, General Motors knew even then it was emblematic of a bygone era, with its body-on-frame construction, rear-wheel drive, and 5.7 liter V8 engine. They stopped producing the car in 1996, and have not made one like it since.
George and Audrey parted with a large sum to own this car, and they treated it with the proper care due a thing of its value. To this day, it is in remarkable shape. It has only 129,000 miles on the odometer, but this is partly misleading: in the summer of 2005, when the car was thirteen years old and had driven a hundred thousand miles—when most people would have simply replaced it—the couple paid nearly five thousand dollars to install a brand new engine. So my 1992 car technically has a motor with less than thirty thousand miles. In 2018, then in their nineties, the couple (through their daughter) sold the car to a man in Riverside who tuned it up, installed new brakes, and then sold it to me for three thousand dollars.
I’ve never met George or Audrey. I know this story purely from the documentary evidence. But when I’m rushing around and busy, and soda cans and fast food bags start to collect on the floor of the front seat or a film of dirt develops on its enamel and chrome, I remember these facts and the care that these people put into the vehicle, and I feel inspired to get off my ass and clean it. This is a special car. Last year, I was rear-ended on the freeway by a newer-model Chevy truck. The accident caused one of my reverse lights to crack, and my rear bumper to be slightly pushed in, but otherwise the car was intact and not in need of repair. That truck’s entire front end was crushed, its headlights and grille shattered into pieces, and its bumper detached and hanging. I don’t intend to part ways with the Roadmaster anytime soon.
Conservation is great too. Some people say my car uses too much gas, but I would point out that I drive 20 miles a day on average, and that as much of the carbon pollution any car creates comes from its manufacturing as from its emissions, so unless one drives a used hybrid or electric, their vehicular carbon footprint is probably deeper than mine. This is a really convenient argument for me. You can probably tell I’ve thought about it.
When I was a teenager, and into my early twenties, I went through a phase of favoring things that were new. I was painfully insecure and afraid of people judging me. I started teenage life largely true to who I was, but also literally large, weighing about 320 pounds. After I lost half of that by the time I was sixteen and realized a woman one day having sex with me could be more than just a hypothesis, I started shopping at the mall a lot and worrying about keeping up with what I thought of as “fashion.” I bought flannel shirts for $50, which were mainly differentiated from the ones at the Goodwill by a brand tag on the front pocket so small as to be almost imperceptible. I never stopped trawling the thrift stores, but it felt like a part of my life that should be hidden from polite company. Painful experience proved me wrong about this. The kind of people who would judge me for this turned out to be people I was wholly unable to satisfy, even if I was giving 110% at lying about who I actually was. Lying about yourself is a strategy that does work in life, but only temporarily. If you’re never going to see somebody again, you can tell them almost anything you want to. Any deviation from that, and things begin to be more complicated.
Overall, I do feel good about reusing goods where I can, because even setting aside the environmental and intellectual-curiosity arguments, I think it’s sad when something is discarded forever. It’s very sad for me to imagine that every stuffed animal I ever had as a kid is probably now buried in a landfill, somewhere in the middle of nowhere, doing nothing but taking up space. I hope, for no particular reason, that some future archeologist finds reason to unearth them someday. Yeah, mine is a sentimental way of experiencing the world, but without sentiment we would just be living-machines, and even all of our scientific endeavors and technological progress would exist only to maintain and perpetuate ourselves as such. I don’t want to live like that.
United Artists had recently needed to answer that question not just for Cimino, but also as it applied to Martin Scorsese. The director had begun work in 1979 on what endures as one of his masterpieces: 1980’s Raging Bull. As odd as it sounds now, Scorsese was then possessed of the notion that Raging Bull would be his last picture. This was part of the reason that he, too, fought hard with UA, particularly in the grueling and complex editing process, which the studio perceived to be taking too long. The movie tells the true story of boxer Jake LaMotta, and the boxing scenes, in particular, needed to be perfectly arranged as to how both the shots and sound were cut. For this, Scorsese worked intricately with (among others) Thelma Schoonmaker, the brilliant editor of most of the director’s work (up to and including The Irishman), whom many believe to be a key to his continuing success and a definite piece of evidence supporting Kael’s collaborative view of film as an art. Raging Bull’s box office gross barely exceeded its $18m budget, and the movie polarized critics, but today it is universally recognized as a contender for the best work that both Scorsese and star De Niro (as LaMotta) have ever done. Many (including the American Film Institute, who put it at #24 on their list) consider it one of the greatest films of all time. De Niro took home Best Actor for the role, but for Best Director and Best Picture, as usual, Scorsese was shut out, this time by Robert Redford’s Ordinary People.
Raging Bull is obviously very different from Heaven’s Gate. But when considering the end of the New Hollywood, both function as strong reasons why it wouldn’t be in the interest of a studio to ever indulge a picky director again. Bull, I should mention, is a film that UA gave Scorsese fully eighteen million dollars in 1979 to make in black and white. It was not commercially-minded. If Scorsese did think it was the last big feature he would ever make, he probably wasn’t alone.
In that New York Times piece, Scorsese devoted some space to defending the fact that he would publicly criticize Marvel at all:
So, you might ask, what’s my problem? Why not just let superhero films and other franchise films be? The reason is simple. In many places around this country and around the world, franchise films are now your primary choice if you want to see something on the big screen. It’s a perilous time in film exhibition, and there are fewer independent theaters than ever. The equation has flipped and streaming has become the primary delivery system. Still, I don’t know a single filmmaker who doesn’t want to design films for the big screen, to be projected before audiences in theaters.
The words are timely, nobody could deny. The box office data every weekend (and every year) is a long list of franchise films, remakes, reboots, reshoots, and reshapes, but seldom is it ever traceable back to an original idea. The MCU is only part of a bigger trend. Even if you removed it from the equation, only the most naive would think its absence would be filled by small, intimate films, or even grandiose original ones. It’s very difficult to imagine Avatar, a film I despise, for which 20th Century Fox put up $237m about a decade ago, getting made today. That film was not based on a book or prior movie, but just an idea James Cameron had. Cameron, mind you, was at that time the reigning champion behind the biggest box office success of all time, Titanic. But it’s hard to imagine even that movie, with its $200m budget and untested stars, getting made today either. Both of those movies, it was feared, would prove to be big budget disasters on the order of Heaven’s Gate. Jordan Peele’s Us received a lot of attention for being a big success this year based on an original concept, but it was one he delivered for only $20m. However you want to measure it, Scorsese is right. The moviegoing experience has permanently changed and we are not given choices that were widely available not long ago.
But what nobody is going to tell him, because anybody in that conversation has an interest in this not being the case, is that movie exhibition is probably just an untenable business. He wants to see his movies, and ones like them, played across the big screen, and that’s going to be less and less practical as the 21st century progresses. 100 years ago, it was the only way to watch a movie (other than through a nickelodeon), so hordes of the public could be counted on to fill the theater. There was no TV, and barely even radio, let alone anything by which we consume the internet in all its forms today. Space, in any American community, was not at a premium either. Even when Scorsese started making movies in the late 60’s, the Brooke Amendment capped public housing rent at 25% of the resident’s income, a benchmark that economists judged to be wise for people at all income levels to apply to their housing cost. It was raised to 30% in the 80’s, and now since nobody wants to raise it further, we who rent (and many paying a mortgage) can merely laugh at the fact that economists once thought of this “rule of thumb” as practical, as we write checks that threaten to split our bank accounts like so many James Cameron’s Titanics. The point being, space costs more now.
Where I live, in California, we are said to be in the middle of a “housing crisis.” It is certainly true that we have a large homeless population, and a much larger population that can barely afford to rent, and then another population always in danger of losing a home they own. Our real estate prices are very high here, but nowhere in the country are they cheap. This market (like the one in 2006) could easily crash, resulting in a major haircut to many pieces of property. However, the fire sale price is still going to be quite high. The question is this: how many square feet does your local movie theater comprise? How many acres does it sit on? Is there another use for that land that’s going to generate more money than the showing of movies?
Sadly, the answer to this last question is yes, no matter who you are or where you live. 100 years ago it was different, because not only was the land of relatively lesser value, but a movie theater could count on plenty of people from your community showing up, even at odd hours. I used to go to matinees in the 90’s and they were much fuller than I find them to be now, when I bother to go. Which is rarely. I wanted to see Joker this year, and I could have, but I repeatedly put it off because it was not convenient. I waited for it to appear on iTunes (before it was available on a streaming service or to rent) where I paid $20 to own it, because it was still a better deal than going to the theater.
The theater experience has changed. It’s not the place I used to go in the 90’s to watch Scorsese classics like Casino or Bringing Out the Dead. The movie theater seems to want to be all things to all people these days. It’s a restaurant. It’s a bar. It has heated seats that recline. You select where you want to sit ahead of time. And so on and so on, all of it implicating a single concept: desperation. This would all be fine, even a new, preferable experience, except for one thing: they want us to pay for it. It’s crazy how much it costs to go to the movies now. Even if you don’t indulge in all those luxuries, you’re still looking at paying somewhere between 10 and 20 dollars for the privilege. A quick perusal just now of my local theater, an ordinary, non-luxury one, reveals that if I want to catch Bad Boys for Life, the number one movie this weekend, at 7:00pm, I’ll need to pay $13.45. I can buy my tickets online, in fact it’s practically necessary if I want to get a good seat, but for that I must pay a “convenience fee” of $1.79. This, in a world where e-commerce is ubiquitous and known for costing less, not more. For my local Major League Baseball team, the Anaheim Angels, tickets start at only $9. And you can see the action from any seat in that stadium. And somehow, something about this experience is supposed to be superior to sitting around in my underwear, in the privacy of my own home, while Joaquin Phoenix terrorizes an environment you might call pre-Giuliani Gotham City. And my 50-inch 4K Smart TV only cost $300.
Scorsese’s right. It is better to watch some movies on the big screen, perhaps the ones that are least like the MCU and arguably the most cinematic—though I know that sentence would enrage Marvel’s acolytes. I desperately wanted to see The Irishman (literally the only movie I was excited about this year) in the theater before it came to Netflix, but it didn’t come to my area. So I watched it at home the week of Thanksgiving, and I still loved it. It wasn’t a perfect movie, but none is, and he clearly still has the touch. But was it worth $140m? Not my $140m, if I were asked to underwrite it. But as just one thing I enjoy with my Netflix subscription, a product I use almost daily? That subscription costs me $12.99 a month, less than that Bad Boys ticket would tonight. If I want to go out, well, I live in Southern California and it’s beautiful nearly every day of the year. I can go to the beach for free.
And this picture I paint is what theatrical exhibition has to compete with. Scorsese, for a guy who likes the big screen, can’t look at the big picture. The Marvel movies are going to go the way of the dinosaur too. It won’t be long before nearly every theater in this country is either bulldozed or repurposed into something else. The real irony is that when this happens, movies like Scorsese’s will be the only ones still playing in theaters. Specialty theaters will survive. Los Angeles is full of them and that won’t change. The Egyptian in Hollywood—which is just now ending a month of running Scorsese and De Niro’s collaborations together—will be doing what it does for a long time to come. Quentin Tarantino’s New Beverly Cinema, where you can often catch great movies from the New Hollywood period, is going to last too. Nearby, the ArcLight, which shows a mix of big and small movies and would have been considered a megaplex just a generation ago, will probably last. This will always be a big town for the movies. But the Marvel stuff will all transition to streaming. That’s why Disney+ had such a massive rollout this year, and why the company is all too happy to price it at $6.99 a month, well below most of their competitors. They’re trying to seize the real estate that they still can: in your living room and on your credit card bill.
Scorsese wrote his piece as though he’s unaware of this inevitability. You can’t blame him, because when the New Hollywood died, it went very much the way he thinks franchise pictures are going to go. A young person today might not know it, but they built the megaplexes specifically to accommodate the franchises. When I was growing up, any city of any reasonable size had multiple theaters: the new one, that showed six, eight, maybe even twelve different movies, and at least one smaller theater that had only one or two screens. Today, the latter is all but gone from American society. This is a land use trend that is broadly occurring in our society: an entity (here, the megaplex) that, if we’re honest, we remember destroying some small business in our own lifetime comes to be seen as an institution, and its loss sparks a reactionary response, as though our communities were under threat. You’ll see it happen soon to the malls, to the extent it isn’t already. I’ve seen multiple posts on social media in recent weeks bemoaning the loss to my community of an Albertsons store, a division of the Supervalu corporation, which had revenue on the order of $12.5b last year. I haven’t lived here that many years, but I know Albertsons hasn’t, either.
What Scorsese doesn’t say, but he must know, is that he once contributed to the displacing of a cinema that many held sacred. The old studio system, or the Golden Age of Hollywood, was just about finally dead by the time Scorsese started making movies, but there were plenty of people leftover who were sad to see it go. Golden Age iconography is still with us today, even if mostly relegated to its late-stage greats: Marilyn Monroe, James Dean, Elvis. These stars and their films are still held sacred by a lot of people. I doubt Scorsese, growing up when he did, would tell you otherwise. But there was not room for both him and those things to exist in the New Hollywood. In 1970, somebody thought they were going to be the next James Dean, specifically, and you’ve never heard of them because there just wasn’t a market for that anymore. Instead they sold insurance, or died of a drug overdose. Perhaps both.
The point is, change is sad. It’s not always sad for you but it’s always sad for somebody. And for somebody else, like Martin Scorsese in 1976, it’s great. But not for Martin Scorsese in 2020. For lack of a better way to explain how society and culture operate, this seems to just be the way things go.
The popular narrative is that the 80’s were a rough time for the kinds of filmmakers who thrived in the New Hollywood. The money-people wanted more like Star Wars, and less like Heaven’s Gate, and who could blame them? Again, Heaven’s Gate ended up costing $44m in a year when the impossible-to-fail The Empire Strikes Back cost only $23m. So the 80’s were unleashed upon the public in a spectacle as close to Scorsese’s scenario where “people are given only one kind of thing” as you’ll ever see. Some of it was good: we got E.T., three Back to the Futures, and three Indiana Jones movies. Some of it was bad: we got seven Police Academy movies. Some of it was mixed. We got Schwarzenegger, Rambo, and enough horror franchise pictures to fill an entire October and then some. It was a big time for movies. Teenage boys fueled the expansion of the franchise industry through “repeat viewing,” and video tapes grew it even beyond that. There was a lot of work if you were in show business.
There was even work for Michael Cimino. By 1985, Dino De Laurentiis was willing to give the director $25m to make Year of the Dragon with Mickey Rourke. To show all those people who called The Deer Hunter racist, Cimino crafted a movie that was as racist against Asians as any I have ever seen, though in fairness, much of the criticism heaped upon it in this vein is of a kind that should apply equally to Full Metal Jacket and all those other Vietnam movies that have somehow escaped such scorn. However, some critics (including Roger Ebert and Rex Reed) did like it, and Tarantino calls it one of his favorite movies. But it bombed, and after Cimino wandered away from its still smoking ashes he took $16.5m and made The Sicilian, an adaptation of Godfather-writer Mario Puzo’s novel of the same name. Ever the francophile, Cimino cast the very un-Sicilian Christopher Lambert (yes, from Highlander) in the titular role. I wish that were the only problem with this very troubled movie, but Cimino’s ego over keeping the film’s running time extra-long led, this time, to a dramatic court battle with the producers. Cimino lost, and a judge endorsed the view that the director had lied by hiding information from The Sicilian’s producers about his actual contract on Year of the Dragon. The Sicilian now stands at a tight (for Cimino) 115 minutes, but I don’t recommend any of them to you. It proved a substantially more severe bomb than Dragon, earning only $5.4m. Then he took another $18m and used it to create the Mickey Rourke bomb Desperate Hours in 1990. Even after all of this colossal failing, Regency Enterprises entrusted Michael Cimino with $31m for something called The Sunchaser with Woody Harrelson in 1996. The internet insists, from all the sources I can track down, that somehow this movie earned only $21,508.00, a number too specific to be a lie.
Little else can be said about The Sunchaser (because I haven’t seen it), aside from this portion of the film’s Wikipedia page, which I reproduce here in its entirety:
Mickey Rourke, a collaborator and friend of Cimino’s, believes the director “snapped” sometime during the making of The Sunchaser. “Michael is the sort of person that if you take away his money he short-circuits,” Rourke says. “He is a man of honor.” Rourke did not say how or why Cimino “snapped.”
Joe D’Augustine, the film’s editor, recalls his first meeting with Cimino: “It was kind of eerie, freaky. I was led into this dark editing room with black velvet curtains and there was this guy hunched over. They bring me into, like, his chamber, as if he was the Pope. Everyone was speaking in hushed tones. He had something covering his face, a handkerchief. He kept his face covered. And nobody was allowed to take his picture […] Welcome to Ciminoville.”
After that, Cimino became reclusive, and radically altered his appearance. He got much thinner and had extensive cosmetic surgery, though he refused in interviews to elaborate on why, aside from claiming some vague problem with his teeth (Ciminoville, indeed.) There were rampant rumors that he was transitioning genders, and all of the press that reported them speaks in tones for which any reputable journalist would be fired today. The 2002 Vanity Fair piece is particularly offensive. At one point, the reporter writes, “I tell [Cimino] I have heard that many transsexuals regret it or simply go mad. ‘How could you not?’ he says. ‘How the fuck could you not, for Chrissakes?’” Obviously, these were rumors the director fervently denied. What his particular gender identity was, I can’t say. In the final interview he gave before he died, The Hollywood Reporter’s Abramovitch raised the issue again, pointing to how far society’s views had come since 2002, but Cimino appeared to grow agitated and called the subject “bullshit.” Whatever Michael Cimino’s internal perception of himself was, there’s no denying it caused him to undergo an extreme physical transformation late in his life. Commenting on facial surgeries, this is what appears in the 2002 Vanity Fair piece:
Then there is the matter of jaw surgery. “I didn’t have the right alignment of my jaw,” he says. “My teeth are fine, but my mouth is too small. The doctor reshaped the whole back of my mouth, and re-aligned all my teeth. I had all kinds of braces, and shit and crap. A couple of years ago I had a guy do the front.… And what it does is, it changes your face, it changes the shape of your face. So, in addition to losing all the fat on your face, the slightest thing you do to your teeth alters the shape of your face.”
Look at the pictures here, and tell me if you think this explains the difference between them, or if you know a person who has had an experience similar to what Cimino described. I don’t, and I don’t. Whatever his more detailed views on the self he crafted were, it would appear he took them to his grave.
Then, in 2016, at the presumed age of 77, Cimino died. It was not immediately known how. He was not known to be ill. Variety wrote in their obituary for the director, “His career is a cautionary tale for Hollywood, about the eternal conflict between artistry and finance, with side battles between creative people and the media.”
Anyone who’s seen The Deer Hunter remembers its final scene. In a film full of scenes that linger in one’s memory long after the details of an ordinary film have faded from it, it is one of the most downbeat and least visually remarkable, but nevertheless it has a powerful force. The main characters have returned to the local bar that serves as a kind of centerpiece to the first act, to commiserate after a funeral that they could not prevent, despite Michael’s desperate attempts to. Spontaneously, they break into song, but the moment is unlike perhaps any other in film that those words could describe. The song: “God Bless America,” and while it isn’t an ironic moment, it isn’t any kind of straightforward display of patriotism either. It’s very hard to describe. It was polarizing from almost the moment the script was written. When Cimino showed the film to Universal-parent-MCA’s chairman Lew Wasserman and his lieutenant, Sid Sheinberg, they were extremely upset by this scene. To Vanity Fair in 2008, producer Barry Spikings recalled Sheinberg as having said something to the effect of, “You’re poking a stick in the eye of America.” When it was actually released, a solid segment of the critical community expressed the opposite interpretation, viewing the scene as some macho celebration of the war, while others were generally confused about whether it was ironic or not. Personally, I take none of these views. To me, it is patriotic, but it contemplates the fact that patriotism is a weird baseline in American identity that we normally take for granted, and it only really emerges to seem significant at times when we don’t know what the fuck else to do, which is the position the characters are in when they sing the song.
This is very much how post-9/11 patriotism felt. I was 18 and I lived in Massachusetts. Both of the planes that hit the World Trade Center had departed Logan Airport in Boston, about 40 miles from my house. One of the planes had been a Tuesday, 8am American Airlines flight to Los Angeles. My dad and I had tickets for the same flight for the following Tuesday, so that I could visit USC, where I hoped to eventually study film. There was no particular reason we chose one week over another. That trip never happened, nor did any academic career for me at that school. Nothing like this had ever happened before and it all felt very real and close to our lives. On 9/11, in the early evening when all the planes had been grounded and it seemed like a day that had kept getting worse after it appeared it couldn’t possibly get worse was finally done getting worse, I went to the town common with two friends, and we held American flags. We were immediately joined by a random group of girls whom we did not know. They had made signs. I don’t remember what they said. But we held the signs with them. We all waved at cars. We didn’t know what the fuck else to do. Nobody felt any particular way except for a general sense of dread. When I watch the song in the final minutes of The Deer Hunter today, this is what I think about. They may have had no actual connection to the war at all, but somebody who made The Deer Hunter really understood something about what real life actually feels like.
In July of 2016, many years after Pauline Kael had taken her last breath, Richard Brody eulogized Michael Cimino in the pages of The New Yorker. He said a lot of things about the director’s work that don’t ring true for me, and probably just as many that do. He simultaneously diminished the importance of The Deer Hunter, calling it the “least visually distinguished” of Cimino’s career—an absurd claim that leads one to suspect his impressions of the director’s oeuvre are less than recent—while devoting fully half of his entire piece to this film. But there was one thought in Brody’s statement that perplexed me: he believes that The Deer Hunter owed all five of the Academy Awards it won to that final scene, and the rendition of “God Bless America.” I do not know whether I agree or disagree with this assertion. Unlike so many of the film’s scenes, there’s nothing in this one that “drives” the narrative, but at the same time, if you removed it, I can’t deny the possibility that I might walk away having had an entirely different experience of this movie. I could speculate for the rest of my life and never know the answer, and this statement is one that could apply just as broadly to Michael Cimino as an artist and a person.
I don’t share the preceding information just to put a coda on the Cimino story. All the contradictions and questions, however impossible to answer, are really key to understanding this period in cinematic history. In the New Hollywood, a movie didn’t have to be sure of what it meant. But if there is a point to be drawn from the story, a takeaway or log line, it is this: Michael Cimino directed four films after Heaven’s Gate, with combined budgets totaling over $80m. This is how much money Hollywood was willing to trust Michael Cimino with in the 15 years after he lost their $44m on Heaven’s Gate. It is truly a remarkable business.
But if Cimino got that many chances, when did the New Hollywood die? Francis Ford Coppola was making Apocalypse Now (as good a story as anything you’ve learned here about Cimino can be found in the documentary Hearts of Darkness) during the same time The Deer Hunter and Heaven’s Gate were in production (it took that long.) It performed admirably, even if it never quite became a huge hit. Nevertheless, Coppola kept making movies, some of them great, most of them awful. Coming Home director Hal Ashby died in 1988 after a long battle with substance abuse, but he kept making movies with tens of millions of dollars of other people’s money almost to the very end. Even Chinatown director Roman Polanski, who literally had to flee the United States to avoid rape charges, never to return, continued to make mainstream, if artsy, Hollywood movies with large budgets. He just had to shoot them in Europe. Nobody’s career ended when the house lights came up for Jaws, Star Wars, or Heaven’s Gate. The New Hollywood never died, and plenty of iconoclastic new directors interested in making high commercial art, like Oliver Stone and Ridley Scott, joined its ranks. The popular narrative about the greatest era in American cinema ending is mythology.
But the 80’s also had an undercurrent of extreme art cinema. Earlier, I mentioned the work of David Lynch and of Jim Jarmusch. Both are directors who have flirted with mainstream movies, but generally did (and do) specifically subversive work with a dedicated cult following. The works most associated with Lynch are his dark, surreal, unconventional narratives, like Eraserhead and Blue Velvet. Jarmusch’s early works, like Permanent Vacation and Stranger than Paradise, are downbeat tales of ordinary people. For all the art credibility that the New Hollywood had, there was really no room in it for directors like these. Viewed most cynically, many of the most groundbreaking New Hollywood films were cravenly commercial—Bonnie and Clyde is a gangster picture in the tradition of many tried-and-true genre entries that came before it. Easy Rider cashes in on the adopted identity of a generation that loves to be marketed to, and the million dollars they invested in licensing contemporary music for the movie speaks volumes to this point. The Graduate does the same thing, with its tale of directionless new boomer adulthood and use of catchy Simon and Garfunkel tunes. Even Ashby’s bizarre Harold and Maude relies heavily on Cat Stevens to get its point across, whatever that is. And in every case, it works. These movies might have done things nobody had ever seen on film before, but they wrapped those things in packages they hoped would appeal to the masses. Even Coming Home and The Deer Hunter capitalize on the country’s latent need to reconcile itself with Vietnam, now that the war is three years in the rear view mirror.
Neither got it right. Cimino, Jane Fonda, Robert De Niro, and Hal Ashby never fought in that war, and there were always going to be limitations on how authentic their portrayals could be. None of their work matches the verisimilitude of infantryman Tim O’Brien’s 1990 story collection, The Things They Carried, or of Bronze Star and Purple Heart recipient Oliver Stone’s Vietnam movies. I don’t believe any non-soldier artist got it right, in that era, until 1982. That was the year Billy Joel released his album The Nylon Curtain, which included the song “Goodnight Saigon.” Like Coming Home and The Deer Hunter, this song makes it about the soldiers, not the war they fought. But more like Deer Hunter, it doesn’t have to be viewed through a particular political lens. Joel confines his inquiry to US Marines, and their many sacrifices, because in 1982 (and this is surprising because it isn’t long after 1978) the war is long gone, an economic recovery either is or isn’t beginning, and people especially don’t want to remember a “wrong war” that was thankfully ended a decade prior. But the price paid by the people who served in it will never be refunded. Of the song, Joel said he “…wasn’t trying to make a comment on the war, but writing about the soldier as a person.” That person is all we have left of that war. We don’t have the peace movement that strove to end war and save lives, as Jane Fonda knew it. And we don’t have the global struggle against the violent spread of communism, as perhaps Michael Cimino saw it. We are only left with our veterans, whose post-war struggles are still a daily reality for many of them, and a sometimes reminder for the rest of us when our lives intersect with theirs.
And the commercially-viable art cinema of the 80’s, if subdued, persisted. Many great directors, even commercial ones, emerged in that decade. Steven Soderbergh made Sex, Lies, and Videotape. The Coen Brothers burst onto the scene with Blood Simple. Spike Lee brought a truly unique style to movies that would be often imitated in the next decade, but never matched.
And the decade that followed that one was like nothing that had ever come before. Independent (or “indie”) film became all the rage in the 1990’s. It was overhyped, of that you can be certain, but it also brought countless new talents to film, who made movies that, without that overhype, probably never would have seen the light of day. Richard Linklater showed us an alternate universe focused on the hidden gems of mundane life, and anything but that last adjective would describe his lasting career. Paul Thomas Anderson’s Boogie Nights proved that even a movie about the porn business can be told on an epic scale, and that Marky Mark was secretly a great actor named Mark Wahlberg, before Mark Wahlberg proved through many subsequent choices that he didn’t need to leave Marky’s particular good vibrations entirely behind. Overall, the 90’s really was an exciting time to be a fan of the movies. It was the decade that introduced America to former Golden Girls extra Quentin Tarantino, the video store clerk who absorbed a huge swath of influences like a sponge, and transduced them into razor-sharpened moviegoing experiences.
Tarantino is going to beat Scorsese in the vote. Both for Best Picture, and Best Director. I’m not certain he’ll actually win the awards. I think Best Picture is a longshot (1917 is Oscar-bait if ever I’ve seen it, and Parasite has really impressed people), but a Best Director prize would be very deserved for Once Upon a Time in Hollywood, a movie that celebrates that uniquely fossilized Hollywood that Scorsese and his ilk displaced, at a time when it was already a sinking ship. Tarantino is known for being a very particular, even artistic director, who nevertheless delivers, when he releases a movie, a commercially uncontainable product that succeeds at the box office in spite of not fitting an action or kids’ franchise-mold. People just really like his movies. Once Upon a Time in Hollywood has grossed $373m worldwide since its release in late July of 2019. In a world where The Irishman had gotten every bit of the big release Scorsese preferred, Tarantino’s movie still would have destroyed him. I don’t see how it can be argued otherwise.
The MCU will never dominate the Oscars, of that you can be sure. Black Panther’s nomination for Best Picture last year was purely to make the Oscar telecast more about the kind of movies people actually want to go see, in the hopes more people would tune in, and it didn’t work. It won’t be repeated. These awards, in which each category is voted on by peers in that field (actors vote for the acting awards, writers for writing, and all for Best Picture), are reserved for creative achievements that go above and beyond adequate, and above all, the Marvel movies strive for adequate. Adequate to conquer the megaplex is still a version of adequate. Scorsese is right when he says there are no surprises in these movies, at least if you’re over 12 years old and you’ve seen a few of them. “Many of them are well made by teams of talented individuals. All the same, they lack something essential to cinema: the unifying vision of an individual artist,” he wrote in the Times. This is what the New Hollywood that birthed his career was all about, and like that wannabe James Dean in 1970, they’re just not buying what he’s selling like they used to. I still am, but I’m 37, and even in my demographic I admit I’m not mainstream. Some of the other directors he name-dropped in the Times are very hip among people my age, like Ari Aster or Wes Anderson, but Scorsese makes movies for my father.
Martin Scorsese is 77 years old. No one would blame him for retiring, like many of his peers are doing. The sweeping changes I described earlier, the bulldozing of the megaplexes, are not something Scorsese is likely to live to see. I don’t know how likely I am to see it. But I would guarantee it’s inevitable. But 77, 87, or 97, my hope is that Martin Scorsese keeps making movies. His movies, to me, are the best ones, and my life would have just a little less meaning if I had never been able to experience them, and it will be a loss when I know I have seen his last one. If Scorsese had retired after Raging Bull, we would have no Goodfellas, no Last Temptation, no Casino, no The Aviator, and these are just the ones that spring into my head when I consider this dystopian alternate universe. I am happy not to live in it.
The awards don’t matter. He already has one at home. Like everyone who puts their whole heart into their life’s work, he knows how good he is at what he does. He doesn’t need me to tell him.
If I had a million dollars, I’d trust the guy with it.
That was the question facing United Artists as they undertook to begin production on Heaven’s Gate. The studio, which had been founded by D.W. Griffith, Charlie Chaplin, Mary Pickford, and Douglas Fairbanks in 1919 to give artists control over their own interests, had been on a winning streak in the New Hollywood. The last three Best Picture winners (One Flew Over the Cuckoo’s Nest, Rocky, and Annie Hall) had all come from UA, though the studio had also made plenty of very commercial fare, including Some Like it Hot and all of the James Bond films. But following a dispute with parent company Transamerica Corporation over autonomy, chairman Arthur B. Krim and senior management resigned in 1978 in order to form Orion Pictures. Transamerica wanted to start fresh, rather than promote from within. They installed Andy Albeck as president of UA, with Steven Bach and David Field as vice presidents of production. It was these latter two men who would oversee Heaven’s Gate, a film which they hoped would continue the studio’s track record at the Oscars.
But first that track record had to be continued at the 1979 ceremony, and they had high hopes with the Jane Fonda vehicle Coming Home. Coming Home was, like The Deer Hunter, a Vietnam movie. But it was wholly different from Cimino’s. Coming Home never goes to Vietnam, but instead, as the title implies, concerns the aftermath. Jon Voight plays a wheelchair-bound veteran in a VA hospital in Southern California. Fonda plays a bored, conservative housewife whose husband (Bruce Dern) is an officer serving overseas in the war to advance his career. When Fonda volunteers at the hospital, she meets Voight, her former high school classmate, and they eventually begin a passionate affair. They hang out at the beach and drive around in a vintage Porsche. There are some sad things that happen, and the film does depict PTSD, but ultimately Voight’s character finds meaning through the anti-war movement, and Fonda’s jilted husband ends his life by swimming naked into the Pacific Ocean. Scenes of Voight giving earnest anti-war advice are intercut with Dern’s suicide. As you see Dern’s naked buttocks, Voight is saying (I’m serious), “And now I’m here to tell ya, I have killed for my country or whatever, and I don’t feel good about it…I’m here to tell ya, it’s a lousy thing, man…” The lines aren’t as tacky as I make them sound, and there is more to it—Voight’s performance is quite moving. But it all does play as typifying a cliché, when viewed today.
As a disabled person (I have one leg), I can tell you this is outside the ordinary experience of an injury like Voight’s. I’m still waiting for a beautiful woman to enter my life in a vintage Porsche, nearly three years after my injury. The Vietnam veterans I know, all but one of whom I would describe as broadly against the war, as Fonda was, have never shared with me memories of an experience like this either, though I haven’t asked specifically. The Deer Hunter’s Steven (John Savage) also comes back from the war needing a wheelchair (both his legs are amputated), but his experience in the VA hospital is decidedly less uplifting than what you see in Coming Home.
It’s not like Coming Home is a bad movie. It’s a good one. It is (spoiler alert) quite a bit better than Heaven’s Gate. It is also the work of a superbly talented New Hollywood director (Hal Ashby, who did Harold and Maude and The Last Detail), and the film was the product of a very earnest desire to make an important statement about the war. Fonda was inspired to create it after meeting the disabled veteran and peace activist Ron Kovic, author of Born on the Fourth of July, which Oliver Stone would adapt a decade later. Kovic told the actress his sex life was better than ever now that he was in a wheelchair; I’ve never read his book, but Coming Home differs considerably from the experience communicated by Stone’s movie. Nevertheless, Fonda was motivated to translate Kovic’s message to the screen by having Voight’s character perform oral sex on hers, which she saw as a feminist statement, and which became a considerable debate with Ashby after both of them learned that some paralyzed veterans could get erections. The final product is a mixed, somewhat confusing combination of both visions, probably limited by its era. But it’s a good movie, and the Best Picture nomination was deserved, as was the Best Actress trophy Fonda received for it.
But it’s very different from The Deer Hunter in at least one important way. Coming Home is an anti-war movie for anti-war people. It is as much about the anti-war movement and how it perceived itself as it is about the traumatized men who came back from the war. It’s also littered with contemporary hippie music that overshadows much of what’s happening. This is unsurprising. Jane Fonda was a peace activist during the period and her politics are well-known. They are especially well-known because, in 1972, she visited North Vietnamese soldiers in Hanoi and posed on an anti-aircraft gun with them. I always thought this was particularly weird for a person associated with something called the “peace movement,” but other than that, it happened long before I was born and I never had much of an opinion about it. But I can’t help but cringe realizing that this same army was at this same moment torturing John McCain in this same city. Fonda has publicly apologized multiple times for the incident, but protesters still confront her over it to this day. If there were a Twilight Zone episode about an anti-war activist getting trapped in an ironic hellscape, it would look a lot like this. I don’t doubt the sincerity of her apologies, but I don’t know how necessary they were either. In a war where many, many young people did many, many things they can’t take back, hers were far from the most blameworthy. For the kind of person who saw Jane Fonda not merely as a propaganda tool for the North Vietnamese but as some kind of traitor of consequence, there was no way Coming Home was going to be a legitimate statement of anything. But if there was a way for the movie to win unconverted hearts and minds in 1978, I don’t think it tried very hard. It mostly confirms for a certain kind of person that what they already believed is right.
The Deer Hunter does not mythologize the war, even though that’s how many seemed to perceive it. It’s the story of three men whose lives are ruined by it, and the ripple effect that has in the aftermath. But it takes a slow path to get there. Cimino makes us watch (for too long) how happy and full of promise these people were, how naive, and how changed they are when it turns out that war is not like a John Wayne movie. But by doing this in its quiet way, without inserting a lot of contemporary political context, the film narrows its message to a personal one, which was the most effective kind of anti-war message even in the era of Vietnam. This is a movie that can show people who supported the war the price it cost the people who fought it. Nowadays, most people can appreciate that war is a condition that has lifelong effects on the people tasked with it. But the popular narrative was once otherwise, even in my lifetime. The old view held that Vietnam was “the wrong war,” and that in the right one, these things would not have occurred. Both The Deer Hunter and Coming Home concern the permanent effects of trauma from war, but only one is about them, without the veneer of an exciting, modern sex life, and that Porsche.
Why The Deer Hunter is so effective at this remains a mystery. Coming Home was definitely the product of a noble attempt to tell the actual stories of veterans, and there was an extensive research process that included talking to many of them. No such inquiry preceded the writing of The Deer Hunter. In a remark whose possible insensitivity one can only imagine he does not comprehend, writer Washburn described the process for the 2008 Vanity Fair piece like this:
“I had a month, that was it,” he explains. “The clock was ticking. Write the fucking script! But all I had to do was watch TV. Those combat cameramen in Vietnam were out there in the field with the guys. I mean, they had stuff that you wouldn’t dream of seeing about Iraq.”
This, coupled with Cimino’s lies about his military service, is a definite indicator that the film was not overly concerned with authenticity. How this was a recipe for a great film which many veterans saw as true to their experience, who knows? Art, and its effect on people, is hard to explain.
But at the 1979 Academy Awards it didn’t matter. The Deer Hunter beat Coming Home for Best Picture (breaking UA’s winning streak), and Cimino bested Ashby for Best Director. Fonda took home the Best Actress award. Cimino said publicly (more than once) that afterward he rode in an elevator with Fonda, whom he wished to congratulate, but she refused to even make eye contact with him. I give this story a 40% chance of being true. Fonda, publicly, has simultaneously taken the positions that: 1) The Deer Hunter was racist; 2) “Our Film Was Better”; and 3) she has never seen The Deer Hunter. Coming Home also took the Best Original Screenplay prize. Voight won Best Actor (over De Niro); however, he later morphed into an arch conservative, and repudiates Coming Home today.
When Heaven’s Gate was seeking to cast its lead actress, trouble arose. Bach and Field wanted Cimino to choose a big star, possibly even Fonda. Cimino became insistent that the role go to the French actress Isabelle Huppert. The executives objected over concerns she could not speak English, but made a deal with Cimino: they would travel to France to give her a special screen test, and if they could understand her speaking the language, he would get his way. If not, he had to cast somebody they approved of. The test happened, and Huppert was awful. When Cimino was confronted over the phone about breaking his word, the director “…told me to go fuck myself, and hung up,” as Field later put it.
But after that Oscar win, there was no stopping Cimino, at least as far as UA president Andy Albeck was concerned. Heaven’s Gate was approved (including Huppert) with an initial budget of $12.5m. Cimino and company, which included producer Carelli, cinematographer Zsigmond, and a top-flight group of actors such as Walken (who had himself taken a Best Supporting Actor Oscar for Deer Hunter), Kris Kristofferson, and Jeff Bridges, headed to Montana to shoot the movie.
Pauline Kael was not a fan of The Deer Hunter. Kael, of The New Yorker, wrote a review that acknowledged the film’s overall quality while being unable to resist taking dig after dig at Cimino, digs that seem predicated mostly on what she doesn’t like about America in 1978. She even questions why nobody is gay among the Pennsylvania steelworkers or soldiers in the war, but through the prism of 2020, you wonder what she means, specifically. Are we certain none of these small-town men are gay, in this time and place? How would we know? Does she expect one of them to say, “Boy, this Vietnam War is nothing like that pride rally”? Personally, when I watch this film now, there are three scenes that convince me that John Cazale’s Stan is a closeted gay man. He’s constantly asserting his heterosexuality and at one point goes on a homophobic rant, but you get the impression he is trying fervently to convince Michael of that heterosexuality when Michael isn’t questioning it. Ironically, Kael would later face harsh accusations of homophobia for various things she had written about gay men (mostly for implying that they were, by default, promiscuous) in her reviews over the years, though this assertion in her Deer Hunter review was not among those objected to. Critiquing Deer Hunter, Kael mostly takes issue with how masculinity—which she calls “boys’-book values”—is promoted in the film. I think this take really ignores how the movie ends. Michael is the only one of the three veterans whose life might ever again look like it did before, but the third act is a series of his attempts at heroism failing utterly. I ascribe this fact at least some value in ascertaining what this movie means. Kael specifically notes that Cimino is not a “great director,” but that criticism is misleading. A “great director” to Pauline Kael, in 1978, didn’t mean the same thing it meant to many others. Pauline Kael wasn’t impressed by the auteur theory.
Pauline Kael was most famous for being an elitist. She wrote movie reviews for people who like to read books instead of going to the movies. But she made her reviews enough like books, with enough references to literature, that the erudite-yet-reactionary readers of The New Yorker wouldn’t be startled out of their finely upholstered chairs. It didn’t work; other snobs there still found her positively uncouth.
Kael was unfairly characterized as an elitist, but not inaccurately characterized as elitist. Both these statements apply to famous remarks she made after Richard Nixon won the 1972 presidential election with an earth-shaking 520 electoral votes, 49 US States, and fully 60.7% of the popular vote. My home of Massachusetts was the only member of the Union to rebuke Tricky Dick, in vain. Kael’s quote about the election was (and is) often repeated: “I can’t believe Nixon won. I don’t know anyone who voted for him.” It smacked of American liberal elitism; a signpost for how out-of-touch that cohort was with the populace in 1972. The only problem was that Pauline Kael never fucking said that. Her actual words, to an audience at the Modern Language Association a month after Nixon won, and reported by The New York Times, were these:
“I live in a rather special world. I only know one person who voted for Nixon. Where they are I don’t know. They’re outside my ken. But sometimes when I’m in a theater I can feel them.”
It wasn’t by accident that the fake quote stuck. The real quote doesn’t disabuse the reader of any notion the false one suggests. Instead, it implies that more than 60% of Americans were not merely people who disagreed with her, or even just misguided or uninformed, but an unheard-of-in-a-generation majority (which included 36% of even the Democratic vote) who were an odious presence breathing on Pauline Kael’s urbane neck in the darkness at those unfortunate moments her livelihood forced her to be within 20 feet of them. This was not a person who had great admiration for ordinary people, despite her own origins from within their ranks, and her summary of The Deer Hunter makes that plain:
[Cimino’s] new film is enraging, because, despite its ambitiousness and scale, it has no more moral intelligence than the Eastwood action pictures. Yet it’s an astonishing piece of work, an uneasy mixture of violent pulp and grandiosity, with an enraptured view of common life — poetry of the commonplace.
Pauline Kael wasn’t in politics, but her view of regular people encapsulates well why it took Watergate for the Democratic Party to win its only presidential election of a 28-year period, and why the Republicans have lost four-and-a-half (the half being 2000) of the last seven. Placing oneself above the fray plays well in dense magazine pages, but not in voting booths. It’s easy to remember Nixon for Watergate and Vietnam, but he also spent his entire career pandering to ordinary people, including seeking to expand Medicare and federalize Medicaid. His motives for doing so may be suspect—he was a megalomaniac so consumed with fear of losing his own power as to have his sycophants compile a large “enemies list,” the expanded version of which contained Fonda, Bill Cosby, and Joe Namath—but many politicians have ulterior reasons for their policies, and not all make delivering for ordinary people a priority. Ironically, Kael was the daughter of immigrant chicken farmers, who lost their Northern California farm when she was eight. The Nixon family’s ranch failed when the future President was nine, in the Southern part of the state. This all paints a picture that makes the class-warfare themes of Heaven’s Gate easier to understand, even if Cimino was a conservative. Incidentally, Kael so hated that film that she called it “a movie you want to deface.”
The discord between Michael Cimino and Pauline Kael is actually more interesting than their politics. Kael admired many New Hollywood films, and actually rose from obscurity to her mainstream platform by lauding Bonnie and Clyde. However, despite her occasional taste for what it produced, Kael was an early and frequent critic of the auteur theory. She believed film was a collaborative process, and engaged in a long-running debate with rival critic Andrew Sarris, a key proponent of the theory (though every bit Kael’s fellow traveler in the field of highbrow Deer Hunter hatred).
Despite its 1941 release (decades before the New Hollywood), Orson Welles’ Citizen Kane is supposed to be a shining example of the auteur at work, because of the extreme creative control Welles exercised over the picture, which he starred in, directed, co-wrote, and produced. However, in her 1971 essay Raising Kane, Kael pointed out how much of that film was owed to screenwriter Herman J. Mankiewicz and cinematographer Gregg Toland, and she was right. I would have added editor Robert Wise to the list. In an irony that could only have befallen Pauline Kael, key parts of the essay were subsequently revealed by Peter Bogdanovich in Esquire to have been plagiarized from the UCLA professor Dr. Howard Suber, whose research and own drafts Kael had gotten her hands on under the pretense that they would work together, before she stopped returning his phone calls. In the book where the essay appeared, Kael failed to credit Suber at all. In the irony to the irony, in 1998 (after Welles had been dead a safe decade), Bogdanovich revealed that the legendary director had taken a strong hand in “revising and rewriting” the Esquire article before its publication. Despite living 30 more years, Kael never responded to the controversy, and Raising Kane is widely viewed as discredited today for the many facts it ignores.
To Cimino, Cimino was the genius. He hadn’t hesitated to try and squeeze writer Washburn out of the picture, just as Welles had Mankiewicz a generation earlier. Even Vilmos Zsigmond, whose genius was rarely questioned in film circles, having shot films like Close Encounters of the Third Kind, was not owed a kind word in Cimino’s eyes. “Vilmos and all those guys have built themselves up to be bigger than directors. It’s bullshit,” he told The New York Observer in 2002. Kael, sophisticated creature of literary circles that she was, surely would never have phrased it that way, but one wonders whether the sentiment in her heart for Howard Suber’s contribution was any different.
But in 1979, what at least one (and probably both) of them didn’t know was that Cimino was about to do more to kill the auteur era than Kael’s printed barbs ever could.
Accounts vary about the actual making of Heaven’s Gate. Some say it was a disaster. Others contend it was a smooth, though very long, process. But objectively, there’s no getting around the fact that there were problems. At the end of the first six days of shooting, Cimino was five days behind schedule, and had already spent $900,000. For that money, he had only a minute-and-a-half of usable footage. Two weeks into shooting, he was getting about a minute of usable movie for every million dollars he was spending. He had shot about two hours’ worth, but of that, the director would approve just three minutes.
Even Cimino’s most loyal defenders don’t deny that his was an extremely long, meticulous process. Kris Kristofferson defends Heaven’s Gate to this day, but concedes that Cimino shot 53 takes of the actor cracking a bullwhip while lying in bed; a complex scene with many other actors that required a complicated reset every time. Kristofferson had to be specially trained with the implement, but as anyone versed with one will tell you, it’s not hard to hit yourself in the face while doing this, and he did. To anyone who’s seen the movie, this is surprising, because there’s nothing special about the action in Heaven’s Gate. It looks as unremarkable as anything in the most unimaginative westerns of the time. Punches land out of our direct sight with an unrealistic smacking sound effect, and so on. Much of the action is on par with an episode of Gunsmoke.
But the method wasn’t limited to action sequences. Dialogue got the same treatment, or worse, and it wasn’t as if the director was taking repeated cracks at achieving some very specific vision. Actor Brad Dourif, who—along with Kristofferson, Bach, and Field—appeared in the 2004 documentary Final Cut: The Making and Unmaking of Heaven’s Gate (based on Bach’s 1985 book of the same name), described it like this: “It was like workshopping on film. We did the happy version, we did the crying version. We did the furious version.” Cimino would even spend hours selecting the right extras for a shot. Here, a callback to Hitchcock is in order—that director was famous for being meticulous, but also for meticulously planning ahead of time, down to what every shot would look like. He would appear on a film set with the whole movie already completed on paper, and the matter was simply getting it from there into a film canister. He also, as already stated, could work on what others considered a modest budget.
By the time United Artists finally stepped in three months later, Cimino had burned through 18 million dollars, and the movie was nowhere near done. He had begun with a budget of $12.5m, but it carried an implicit promise that he was working with a blank check. His contract lacked certain basic protections that would have incentivized finishing on time. Now, the studio put its foot down: 25 million and not a penny more, and if he screwed this up he would forfeit final cut, a director’s right to approve the final version of the movie. Plus, they fired Carelli, so she could no longer run interference for him. Cimino, to his credit, sped up and completed the work on time. But, as with The Deer Hunter, a scandal was brewing that would soon unfold in the pages of a newspaper, and this time, nobody at the Pentagon could stop it.
Les Gapay was a former Sacramento Bee and Wall Street Journal correspondent who had moved to Montana for a change of pace. There, he operated a cherry orchard, and kept writing on a freelance basis. In this latter capacity, he tried to get on the set of Heaven’s Gate to cover the production. Gapay was told Cimino’s set was closed to the press. Undeterred, he enrolled as an extra, and began to record his observations.
This is a classic Hollywood loophole that remains unclosed. Movies simply need people to fill the frame, and there is no time or resources to vet them. This is how I got on the set of Wolfgang Petersen’s The Perfect Storm as a sixteen-year-old. This is also how, after being fired (by fax; this was 1995) as director of the preposterous The Island of Dr. Moreau, the eccentric Richard Stanley was able to infiltrate the production disguised in full costume and makeup as one of the “dog-men.”
Gapay’s account first ran in the Los Angeles Times on September 2, 1979, but his musings would soon spread nationally. According to Gapay, Heaven was a dangerous place to be: there was a multitude of injuries; people fell out of wagons and were stepped on by horses. In the account, a fight nearly occurs between an extra and a crew member, when the latter instructs a wagon driver, “If people don’t move out of your way, run them over.” Gapay himself falls out of a wagon, and on another day is stepped on by a horse. An X-ray would reveal a crushed toe. One extra withdrew himself and his family from the production after there were 16 accidents in one day. At times, extras in close quarters were forced to wear heavy wool garments amid excessive heat and the smoke used for effects; some fainted. They were forced to work from 5am to 11pm, with only one meal, for which they were charged $3 in 1979 dollars. Cimino balked at observing what are completely standard union rules about breaks and meals.
There were also tremendous clashes between the movie and the National Park Service, with the former repeatedly violating boundaries the latter set (reportedly at Cimino’s direction) and eventually being barred from Glacier National Park entirely. Among these (many) violations, Cimino broke the conditions of a shooting permit that allowed him to use one cow carcass in a scene by excessively butchering it and spreading its entrails; a ranger discovered three cows being butchered and their insides being spread around a tent set. The Park Service was concerned about luring bears into the area. But some locals criticized the Park Service for overreacting, because Heaven’s Gate was a boon to the otherwise quiet local economy. It was a complicated struggle of economic interests not unlike the plot of the movie.
But the most memorable parts of Gapay’s report are the surreal ones. At one point, a group of locals tosses Cimino into the mud, a symbolic harbinger of his impending career trajectory. In an extended version of Gapay’s account published in The Washington Post, he goes into great detail about Cimino’s craft:
In one typical scene, Cimino moves some suitcases around and orders a prop man to “give that man some glasses and put a wedding ring on that woman’s hand.” In another scene he spends several minutes having an assistant move around socks drying beside a fireplace. He remembers from one day to the next if an extra is slightly out of place. “You had a cigar yesterday,” he tells one extra in a cockfight scene.
This account contains my favorite line: “‘Cimino interviewed 300 horses for this movie,’ quips one crew member.” That would be funny, and it is, but it becomes much less so when cast in the light of a controversy that would emerge later about Heaven’s Gate: animal cruelty. Despite Cimino’s pickiness over horses, he does not appear to have valued the animals very much. This is a portion of what appears on American Humane’s “Humane Hollywood” website about the film:
The animal action in the film includes an actual cockfight, several horse trips, and a horse being blown up with a rider on its back. People who worked on the set verified more animal abuse, such as chickens being decapitated and steer being bled in order to use their blood to smear on the actors instead of using stage blood.
To be clear, because those words actually minimize it: one of the accusations against Heaven’s Gate is that a living horse was actually dynamited to death, and that this footage appears in the final film. The filmmakers had limited tools to defend against these accusations and many others; they settled lawsuits brought by the owners of animals. The AHA had been barred from supervising the animal action on the set, and later protested the movie. It was as a result of the Heaven’s Gate fiasco that the Screen Actors Guild and the Alliance of Motion Picture & Television Producers now contractually authorize AHA oversight of all animals used in filmed media. When the film is shown in England, the British Board of Film Classification mandates that these scenes be removed.
Once filming wrapped (and after traveling to England to complete the film’s complex “Harvard 1870” opening, which adds nothing to the movie), the editing process began. Cimino was contractually obligated to deliver a version of the film that ran between two and three hours. He instead submitted a workprint to the studio which reportedly ran 325 minutes (five hours and twenty-five minutes). If this length worried executives, he assured them that it was “about fifteen minutes longer than the final cut would be.” The version you can stream today is 220 minutes, and as a person who has seen it, I am shocked that there exists a version of this movie that is longer. After much fighting between Cimino and UA (during which time his firing was put back on the table), the director eventually delivered this shorter version, and a premiere date (a year late) was set for November 1980.
But it was all for Cimino’s art. Many masterpieces have had to break a few eggs. So what’s the final product like? In The New York Times, Vincent Canby wrote the words that would go down in history: “an unqualified disaster,” the critic called Cimino’s opus. The critical reception was almost universally negative. It was all the haters of The Deer Hunter forging an unlikely alliance with all of that film’s defenders, for the common cause of rejecting Cimino’s pretentious, flat, disaster-piece.
Today there is a competing historical narrative that says Heaven’s Gate was never given a fair hearing. This view holds that the bad press from Gapay and others prejudiced critics and the public so much that there was no way this film could succeed. These people claim to like it, and some believe it to be a masterwork. This is (sigh) especially true in France. Don’t believe them. If you don’t believe me, see this movie for yourself. Or at least try to. See how much of it you can get through. Right now, it’s available on-demand from the Starz Network.
It is true that the movie is often beautiful to look at, because of Zsigmond’s cinematography. But imagine the most breathtaking view you’ve ever seen. Or even the most breathtaking five views you’ve ever seen. Now imagine being forced to look upon these views for five hours and twenty-five minutes, without much else to command your attention, and you will have a notion of the experience Michael Cimino wanted to subject filmgoers to. At three hours and forty minutes, the version we have is somehow no less true to this analogy.
The actors are not bad. Sam Waterston gives grand orations that presage what Law and Order fans would come to love more than a decade later. Kristofferson performs admirably, where he can, and Jeff Bridges is interesting. Christopher Walken is especially compelling, though he seems distinctly a creature of the 1950s, not the 1890s. Huppert’s English was mostly understandable to me.
But the writing is awful. The dialogue oscillates between needless and pretentious, rarely finding a comfortable position in between. If the results of the WGA arbitration were not enough, this film stands as a testament to the fact that Deric Washburn was the actual genius behind The Deer Hunter, along with Zsigmond’s cinematography and the actors’ excellent work. A trip through Heaven’s Gate really endorses Pauline Kael’s view of the auteur theory, and of Cimino in particular.
Like Jaws and Star Wars, Heaven’s Gate made history. Just not the kind you want to make. The final price tag on the film was $44m. It went on to gross a mere $3.5m worldwide. Its profitability was exactly the inverse of the movies that started the New Hollywood, and as I said at the outset, Heaven’s Gate is regarded by many as the film that killed it.
It did kill United Artists, in a sense. The loss forced parent Transamerica to sell the studio, resulting in a merger with rival MGM, which ended the style of management there that had led to UA’s Oscar-winning streak, but also to Cimino’s wreck. For many years that followed, through many corporate restructurings and sales, the studio made mainstream fare almost exclusively. Heaven’s Gate marked the end of an era, not just for UA, but for cinema as a business. And it is a business, even if it is a business that produces much more quality art than say, the office supply manufacturing business.
For Clint Eastwood in 1974, the answer to that question was $4m. Eastwood’s Malpaso Productions trusted Cimino, who had cowritten Eastwood’s Magnum Force a year prior, to direct the actor and costar Jeff Bridges in Cimino’s first feature, Thunderbolt and Lightfoot, a buddy comedy about a bank robber (Eastwood) and a car thief (Bridges) thrown together by calamity on a meditative road trip. Despite his inexperience, Cimino completed the movie on time and on budget. In the wake of Easy Rider, unconventional road movies were popular at the time, and Thunderbolt was no exception. It grossed a respectable $25m and was received well by critics. It was not a huge success by New (or old) Hollywood standards, but everything about it implied Cimino had a talent for much more than commercials advertising discounted airfare.
For his next trick, Cimino pulled off something remarkable. 1978’s The Deer Hunter is the story of three blue-collar friends from small-town Pennsylvania who eagerly depart for Vietnam, only to be forever changed by a horrific and traumatizing experience there. The story ends with their lives in seemingly irreparable ruin as a result. The film was a veritable all-star team of the era’s new talent, with a cast that included Robert De Niro, Christopher Walken, John Savage, Meryl Streep, and John Cazale, all photographed by one of the greatest cinematographers of all time, Vilmos Zsigmond.
Who wrote The Deer Hunter? This is not unlike asking Michael Cimino’s age. The film’s central metaphor, and the source of its biggest controversy, are the scenes of Russian roulette throughout the movie. Surprisingly, the origin of these appears to be an unproduced 1968 screenplay called The Man Who Came to Play, about a group of men who play the deadly game in Las Vegas, which had been purchased when studio EMI began expanding from music into film. The Russian roulette element is probably the extent of the substance of The Man Who Came to Play that remains in The Deer Hunter, other than both projects concerning a group of men. The Deer Hunter that you see on screen reflects a very specific intent to make a film about the war.
Cimino would say, and he claimed this until he died, that he wrote the movie. All parties concede that writer Deric Washburn was hired to pen the project. As Cimino told it, Washburn had delivered a screenplay that was a shambles: “I just could not believe what I read. It was like it was written by somebody who was … mentally deranged. He was totally stoned on scotch, out of his frigging mind. He started crying and screaming and yelling, ‘I can’t take the pressure! I can’t take the pressure!’ He was like a big baby.” Putting aside the credibility of Michael Cimino referring to another person as “out of his mind,” let alone a “big baby,” the only person who appears to second this account is his longtime producer and probable lover, Joann Carelli. Washburn’s version is the opposite: “It’s all nonsense. It’s lies. I didn’t have a single drink the entire time I was working on the script.” Washburn concedes he was under immense pressure to complete the screenplay, but insists that other than three days of mapping out the plot, Cimino contributed nothing to the process. To Vanity Fair in 2008, he described a dramatic scene, fitting for the Harvard-educated playwright he is, wherein Carelli delivered his walking papers over dinner: “We finished, and Joann looks at me across the table, and she says, ‘Well, Deric, it’s fuck-off time.’ I was fired. It was a classic case: you get a dummy, get him to write the goddamn thing, tell him to go fuck himself, put your name on the thing, and he’ll go away. I was so tired, I didn’t care. I’d been working 20 hours a day for a month. I got on the plane the next day, and I went back to Manhattan and my carpenter job.” For what it’s worth, the dispute went before a neutral decision-maker: the Writers Guild settled the issue, awarding Washburn sole credit for The Deer Hunter.
In fairness, WGA arbitration is a process famous for excluding writers who contributed a great deal to a project in favor of limiting the number of names who get an official credit, but I have never heard a story about sole credit being given to an unfamous writer who delivered a draft that was of the sort Cimino describes. The WGA process involves reading the various drafts, so they knew what Washburn turned in, and how it may have differed from what ended up in the movie, and they chose to exclude Cimino entirely. Whatever the truth of Cimino’s contribution may be, there’s no possible way it’s anything like what he claimed it was in interviews.
Cimino’s craft as a director was, by all accounts, detail-oriented to the point of obsession. Here is one example, especially pertinent to this discussion, from the Vanity Fair piece published upon The Deer Hunter’s 30th anniversary in 2008:
De Niro, as is his custom, had done meticulous research for the project, speaking to a number of veterans. Cimino gave him a wallet containing a driver’s license with the actor’s picture and his character’s name, along with family photos that had belonged to a real veteran. According to Walken, the director also gave the cast a photo of a dozen or so children which he said held great significance for him, though he declined to reveal what that was.
All of this is odd, but odd is not unusual in filmmaking, and it was practically normal in the context of the New Hollywood. De Niro and Walken are both New York actors known for their eccentric methods of preparation, and these things very well might have helped their craft. What is noteworthy here is the driver’s license. It signals that, as far back as 1977, Cimino was in the habit of creating fake documents, which reframes whatever you want to call the event of his showing that passport to Vanity Fair in 2002 to lie about his age. There may be other eccentric directors with a record, in the press, of forging identity documents for reasons both professional and personal, but I know of none.
The Deer Hunter had a limited release in December of 1978 to qualify for the Academy Awards. It was immediately raved about by many critics, and just as immediately despised by others. As one of the first big movies that tried to reconcile the American experience with the Vietnam war, controversy was perhaps inevitable.
I first saw the film as a teenager. Despite a cynicism that was somehow greater at 16 than it is today at 37, I was moved by it. Being born in 1982, the only feelings I had attached to the war were based on the residual traces of it that hung over the eighties: the POW/MIA movement, a visit to the Vietnam Veterans Memorial wall in Washington when I was 11, and whatever other movies I had seen up to that point, of which there were several. But while Platoon and Forrest Gump may have presented depictions of the war at opposite poles, The Deer Hunter is really of another character entirely. Because the film is separated into three distinct (very long) acts, covering the time before the characters go to war, moments from that grueling experience, and their lives afterward, the viewer is rarely spared discomfort. The first act is a leisurely, often beautiful experience in which we live out the last joys of the characters’ lives along with them, almost in real time. That first act serves to make the subsequent two all the harder to bear, as we are presented with a series of horrors, or one horror so terrible it colors everything that comes after it, also in what often feels like real time. I thought The Deer Hunter was a great film, and I never wanted to watch it again. To this day, I’ve only seen it three times in my life.
I moved to Hollywood in my mid-twenties. One day, I was browsing in a dirty, crowded shop of miscellaneous memorabilia; the kind of place that exists less and less in Los Angeles as property values reach for new heights. Combing through a large box of what were little more than scraps of paper, I discovered a small poster for The Deer Hunter; what was once referred to as a “lobby card.” It had a small crease but was otherwise in excellent shape. The proprietor sold it to me for 50¢. I took it home and framed it.
When I began to read critically about The Deer Hunter, what most surprised me was the controversy the film generated when it was released. Many judged it to be a revision of history at odds with the agreed-upon conception of the war common to the mainstream American left. What was shocking to me about this analysis was how at odds it was with my own experience of the movie: I had believed The Deer Hunter to be perhaps the most anti-war film I had ever seen. It presents an image of the war that LBJ or Richard Nixon would never have conceded existed, let alone endorsed. If it says something about the war, it is that the war was a nightmare that ruined the people drawn into it, even the ones who left it with their lives and limbs intact. The idea that the war was worth fighting, the main contention of its defenders to this day, is utterly off the table when this film concludes its 183-minute running time.
Despite its conclusion nearly 45 years ago, the Vietnam War remains a third rail in American discourse. I hesitate to express any opinion about it, particularly because that opinion tends to incite true believers at both poles of the subject. While popular culture likes to pretend all baby boomers were antiwar hippies, this is clearly untrue. Many were squares. Many hippies subsequently converted to squares and changed their view. Some squares are now in line with the hippie view. However, way too many people in at least the first two groups are just unwilling to confront the idea that they may have spent their youth accepting facts without reexamining them critically as history has progressed. Of the hawkish right, the most generous view, as I can discern it, is that this costly (more than 58,000 American lives and one trillion in today’s dollars) and misguided war was undertaken mainly as a proxy fight with America’s very powerful enemies, none of them named Vietnam, and that engaging in that fight may have served to help forestall a devastating nuclear conflict on a world scale. Of their opponents on the left, the similarly generous view is that they opposed what was truly an unjust fight that devastated a nation, that it was further unjust to drop young Americans in the middle of that fight, and that the support which many of them expressed for the North Vietnamese and the Viet Cong, despite identifying as “anti-war,” was based only on ignorance of the reality that these groups were as “pro-war” as anybody sitting in the Pentagon. On all sides, the people who actually fought the war were probably people who had no other choice, because free will is often illusory, so that even those who enlisted voluntarily were not selecting that option in a rational calculus that wavered between it and joining a peace demonstration at Berkeley.
And in any case, that false choice is obviously still much more of a choice than any native Vietnamese, who were inarguably set upon from all sides, ever had. Of those who served on all sides, most served honorably, some served terribly. Historical scholarship has produced little but a series of horrors about the conflict in the ensuing years. How many were honorable or terrible, on which side, and when? What would a pie chart of it look like? I can’t tell you, and neither can anyone else, with any accuracy. Do you have an opinion about who was right and who was wrong? What would the pie chart have to look like for you to reconsider it, specifically, numerically? Most of the worst parts of the war are things which are true of all wars.
It’s common to see World War II as just, and Vietnam as unjust, but a civilian incinerated by an allied bomb at Dresden or Hiroshima might look with envy upon the millions of Vietnamese who suffered disability or illness from exposure to Agent Orange, but nevertheless got to keep living. Of course, plenty of Vietnamese were killed just as dead by American bombs and bullets. In violence perpetrated by all sides, as many as four million Vietnamese may have been killed during the war.
Many complained The Deer Hunter was racist. This is not an example of people making something out of nothing. In the film, the white people are mostly victims, and the Asian people, or at least the ones who have lines, are mostly monsters. If you are the type of person who reads every movie as a statement about the races you see on screen, you will probably judge this movie to be racist. What’s strange is how specific this complaint is to The Deer Hunter. The movie contains arguably fewer depictions of actual racism than almost any other Vietnam movie. The clichéd racial slurs directed at the Vietnamese are not constantly spewing from the soldiers’ mouths. The racism is mainly evidenced by the horrors perpetrated by the Viet Cong—which, incidentally, were substantially toned down from some almost comically atrocious ones that appeared in earlier drafts of the script.
I accept this critique as valid. The film lacks positive portrayals of the Vietnamese, but the film is also very limited in its scope. What I don’t accept is that Coppola’s Apocalypse Now—wherein native Southeast Asians are cattle-like savages who fall under the spell of a charismatic white leader—isn’t subject to the same critique. Even more egregious is Stanley Kubrick’s Full Metal Jacket, which literally inspired a generation of racists to weaponize its portrayals against Asian women (in addition to inspiring the rappers in 2 Live Crew and Sir Mix-a-Lot). But those movies also depict the United States as a horrible force in a way The Deer Hunter doesn’t, other than in its bombing and obliteration of a village, which I still think is pretty bad. But again, the film’s scope is pretty limited. It doesn’t attempt to tell the story of the war, but rather a personal one of characters caught in it.
This problem with The Deer Hunter highlights one of the primary problems (and there are many) filmmakers face when their work is perceived as a statement about race that they didn’t intend.
This issue is complex. Take another example, this one seemingly benign: Overlord is a 2018 horror/action movie about a group of American paratroopers who, landing in France on the eve of D-Day, encounter Nazi science run amok in the form of human beings who have been transformed into super-zombies. It’s a decent movie, certainly above average for what that description implies, particularly in its special effects and the pedigree that comes with giving J.J. Abrams a producer credit. But I noticed one very striking thing in the film’s opening moments: the paratrooper unit is depicted as containing both black and white service members. The problem is that the Armed Forces were segregated until Harry Truman signed Executive Order 9981 in July of 1948, four years after this movie takes place. Though about a million of them served in World War II, black soldiers were treated as unequal, and denied assignments and opportunities extended to white soldiers. This is the unfortunate reality of an event in American history which popular culture now chooses to remember as exclusively defined by heroism.
But what were the producers of Overlord to do? Representation of various demographics in a film like this is essential economically, but it’s also just the right thing to do at this point in history. Personally, I think the performance of black actor Jovan Adepo is central to what makes this a decent movie, and it wouldn’t be worthwhile without him. This movie did not create much of a controversy for its revising of history (it barely recouped its budget and went unnoticed by most people altogether), but nevertheless at least one outlet (Inverse) asked Adepo about it. He told them, “We’re not trying to make a historical movie,” and went on to suggest the film takes place in an “alternate universe.” This isn’t a bad answer, and I will say, the liberties taken with the historical record did not affect my ability to enjoy Overlord. But I wish they hadn’t overlooked it, and expected me to overlook it, when the writers could just as easily have crafted their story so that fate had thrown together a black unit with a white one, in order to achieve the same result. I would prefer this route because, I know, somewhere in America, a seventeen-year-old is seeing this movie (and ones like it) and basing his knowledge of American history thereupon. I know this, because I was a seventeen-year-old who believed things I saw in movies, if I didn’t have reason not to. I suppose Overlord‘s historical rewrite would be closer to harmless if we were living in an America in which racism had been eradicated for the plague it is, but anyone who pays attention knows we’ve got a long way to go. Even in that case, it would still be wrong to erase from history the real struggles and real heroism black Americans experienced and displayed during the war, which are often cited as being one impetus for the civil rights movement.
And this all raises the question: how should the Overlord producers actually have made their movie? If they had told a story exclusively about white characters, they would arguably be racist for lacking diversity. But as it stands, they’ve erased the suffering of a minority group from history, which, there’s an equally good argument to be made, is more harmful. And even if they took my advice, and created a story about two groups forced by a plot point to become intertwined (in this schlocky horror movie), they would risk being accused of treating issues of race without an appropriate degree of sensitivity. So the movie exists as it is, and ultimately, had Overlord been much more successful, I have no doubt it would have gotten much more criticism on this point.
And this is the danger of racism that exists with The Deer Hunter; that someone will see this movie and choose to form an impression of Asian people based on what they see the only Asian characters of consequence in the film do. As in my critique of Overlord, beyond the implicit (or explicit) racism of the situation, the other ground upon which Deer Hunter was challenged was its historical accuracy. In the film’s most famous sequence, the main characters have become prisoners of war, and they are forced to play Russian roulette (this being the thing borrowed from The Man Who Came to Play) by their Viet Cong captors. It is among the most tense scenes in the history of cinema. I defy anyone to watch it without having a visceral reaction. This scene sparked (it isn’t Cimino-esque hyperbole to say) a global, negative reaction. Aside from the many critics who took issue, Peter Arnett, who won the Pulitzer Prize for covering the war, said in the LA Times, “In its 20 years of war, there was not a single recorded case of Russian roulette … The central metaphor of the movie is simply a bloody lie.” Elsewhere, at the 1979 Berlin Film Festival, the Soviet delegation, joined by other socialist states, withdrew their films in protest of The Deer Hunter.
Today, the most famous Vietnam veteran in the United States is, far and away, Senator John McCain. He is more famous now than he was when he died in 2018 after a protracted battle with brain cancer. McCain had been a household name since at least his bid for the Republican presidential nomination in 2000, but the title of the nation’s best-known veteran was long contested by his Democratic counterpart, former Secretary of State (and Senator) John Kerry. But in the summer of 2015, McCain shot way ahead of Kerry, and he’s remained there since, even in death. Kerry might take the lead again, but for now it appears McCain’s position in history as the face of the Vietnam veteran was cemented by the events of that summer, and those that followed. This is because in July of that year, then-candidate Donald Trump made these remarks before an audience in Iowa: “He’s not a war hero,” said Trump of McCain. “He was a war hero because he was captured. I like people who weren’t captured.” The words were met with immediate, sweeping public scorn, even from the audience Trump was addressing.
The comments exposed Trump, to the extent it wasn’t already obvious, as someone who never thought any further about Vietnam than the distance to his own allegedly bone-spurred feet. But they energized many against Donald Trump who had previously ignored him, and illustrated why McCain remains the revered figure that the current president, though he holds that office, can never be. Politically, McCain was never widely popular. He was a “maverick” in his own party, a designation which brought him positive press on occasion, but does not inspire trust in one’s colleagues in a political party. This didn’t do him any substantive favors in the other party either, with whom he was never quite aligned enough on any issue—at least until one of his final senate votes opposing Trump’s will—to secure a following. What he did have was the respect of nearly everyone in public life except Trump, and of the public at large. He was, by most accounts, an honorable person. But the specific way people from all corners sprang to McCain’s defense over Trump’s despicable remarks was due in no small part to certain details of the story which the former reality-TV personality’s paraphrasing neglected. McCain was not just a veteran. He was not just a prisoner of war. John McCain was tortured.
McCain was shot down over North Vietnam while serving as a naval aviator in 1967. After living through that, he was captured, and imprisoned in what was ironically called the “Hanoi Hilton,” and not released until 1973. During his captivity, he was subject to beatings several times a week so that his captors might extract propaganda statements from him, which he would be forced to sign under duress. Of the experience, he later wrote: “I had learned what we all learned over there: every man has his breaking point. I had reached mine.” And it should be noted that McCain, during this time, was receiving “special” treatment by the North Vietnamese, because they were aware his father was a very powerful naval commander in the war, and believed he might be a unique propaganda tool. McCain refused a release out of sequence with the men held alongside him, citing US military policy.
But McCain’s torture was not unique. According to the accounts of POWs, prisons in the war were places of such atrocities as murder; beatings; broken bones, teeth, and eardrums; dislocated limbs; starvation; the serving of food contaminated with human and animal feces; and medical neglect of infections and tropical disease. Torture occurred in many circumstances in the war, perpetrated by parties on all sides. The Viet Cong, just as you see in the movie, did perpetrate massacres of civilians, particularly at Huế. The South Vietnamese are certainly reputed to have been as bad as their enemies, if not worse, and Americans are not without blame for that, in addition to the extensive recorded atrocities by American personnel, such as the Mỹ Lai Massacre, where US soldiers mutilated, raped, and murdered over 500 civilians, most of whom were old men, women, and children.
But what about any of this matters for the context of The Deer Hunter, and its controversial Russian roulette sequence? The particulars are different. Characters in the movie are tortured by the Viet Cong, a marginally different group than the North Vietnamese, who tortured McCain. The events occur not in a prison, but in a rural setting. The atrocity is not any of those enumerated above, but forced participation in the deadly game. And of course, the film does not portray American personnel committing horrors like My Lai. All of that said, what is the scope of the “bloody lie” that Arnett took exception to? Yes, it would appear the historical record does not include Russian roulette, but it does include the horrific torture of McCain and others. It’s also not Vietnam that you see in the movie, but rather Thailand. Is that problematic? What I’m getting at is what defenders of the movie, Roger Ebert among them, contended at the time: the artistic license taken with the material isn’t qualitatively different than the reality of what the war was. This is quite different than a biographical film that unfairly maligns a specific historical figure through the presentation of events that never occurred, or one that unreasonably lionizes such a figure by purposely excising similar events that did. Consider Arnett’s sentence again in full: “The central metaphor of the movie is simply a bloody lie.” If it were the presentation of actual events that had happened, it wouldn’t really be a metaphor, would it?
While the movie did have its eloquent defenders, Michael Cimino was not among them. He did a series of things which, if this all were to happen today, would have made the situation much worse and destroyed the movie’s reputation forever.
The trouble began before the controversy over the movie. Many people who worked on The Deer Hunter have said that they believed the material was drawn from Cimino’s own Vietnam experiences. The claim most often repeated is a smaller one: that Cimino was a medic attached to a Green Beret unit in Texas, but was never sent to Vietnam. This is also a lie. The origin of this detailed version is an interview Cimino gave to The New York Times‘ Leticia Kent on the eve of the film’s limited release in 1978. In that interview, Cimino describes joining the army as something that happened in the wake of the 1968 Tết Offensive, as though he had rushed down to a recruiting office to take up arms in service of the nation. Of course, when the time to check facts came around, Kent could not verify any part of this story, and when she contacted the film’s domestic distributor (Universal) over the discrepancy, they panicked. “He told the fucking New York Times he was a medic in the Green Berets? I know this guy. He was no more a medic in the Green Berets than I’m a rutabaga,” Thom Mount, former president of Universal, recalled having said, when speaking to Vanity Fair for the 2008 piece. Apparently Mount brought the problem to Universal’s parent company, MCA, whose chairman, Lew Wasserman, was subsequently able to provide a number at the Pentagon that was passed along to the reporter. Presumably the information was confirmed, because it appeared in Kent’s story, and was endlessly repeated thereafter. Wasserman famously had strong connections in politics to both Republicans and Democrats, and was Ronald Reagan’s former agent. Eventually, the truth emerged, when former New York Times Vietnam correspondent Tom Buckley, writing for Harper’s, was able to corroborate that Cimino had done a stint as a medic, but it was for six months in 1962, before the United States entered the war. Also, far from a Green Beret, Cimino had done his service in the Army Reserve.
Cimino denied having lied, and had his publicist say that he was going to sue Buckley. No lawsuit ever materialized, at least with Cimino as plaintiff. There would be litigation later in his career, though it would feature Cimino as the defendant. But misunderstandings about his military service, including that he had served in Vietnam, were somehow communicated to key people on The Deer Hunter. Vilmos Zsigmond told Vanity Fair in the 2008 story: “It seemed to me that he was involved with the war, that many, many of the stories in the film are biographical. But I don’t know where and how. He never really was specific about it.” Even his closest collaborator, Joann Carelli, could not clarify the truth about Cimino and the military for Vanity Fair: “It’s hard to tell with Michael. I don’t know where this comes from.”
Meanwhile, within the present timeline, in the Scorsese Cinematic Universe, it appeared to some that “Marty” was beginning to walk back his comments about the MCU. The Monday after the weekend The Irishman began to appear in a handful of theaters (before its larger release later in November on Netflix), he published an opinion piece in The New York Times entitled: I Said Marvel Movies Aren’t Cinema. Let Me Explain. And if it appeared that way, to some, it appeared wrong. They must not have read beyond the headline, which is completely literal. They were probably not the types who read editorials in The New York Times, because they are more at home discussing Marvel movies on Twitter.
In the piece, Scorsese doubles and even triples down on his prior assertions, which, read generously, could have been paraphrased as him simply saying “I don’t think those movies are good. Those aren’t my kind of movies, an art form I call cinema.” In the Times editorial, Scorsese expanded this to something that might be similarly paraphrased as “These movies are actually bad for society, and people who think they like them are just stupid.” That might sound sensational, but consider this actual quote:
And if you’re going to tell me that it’s simply a matter of supply and demand and giving the people what they want, I’m going to disagree. It’s a chicken-and-egg issue. If people are given only one kind of thing and endlessly sold only one kind of thing, of course they’re going to want more of that one kind of thing.
Does Martin Scorsese think this is what happened in American cinema? I ask this genuinely, because although his work is very rich in symbols and metaphors, I’ve always observed him (like in that headline) to be a very literal thinker when he speaks publicly. His commentary is thoughtful, and he doesn’t usually speak in broad exaggeration to hammer the point, the way many of us do.
His description, as quoted above, is an incorrect statement of what happened in the movie business. The MCU has only existed since 2008, but superhero movies, in their modern incarnation, date back at least to the success of Tim Burton’s first Batman in 1989, and probably to Richard Donner’s Superman in 1978, which spawned three sequels over a decade. And if you want to split hairs, superheroes (including Superman and Batman) had appeared in movie theaters even before that. The modern popularity of the MCU seems abrupt, but it’s been developing for Martin Scorsese’s entire life. Comic-strip character Flash Gordon was finding success on movie screens before the director was ever born.
Scorsese makes other assertions in the piece that are compelling and true. He cites a list of directors still working today who meet his definition of cinema. These are Paul Thomas Anderson, Claire Denis, Spike Lee, Ari Aster, Kathryn Bigelow, and Wes Anderson. This is a somewhat eclectic mix, but none of them are outside the mainstream, except possibly Denis, and she probably only lands that far from it because there’s a limited market for French-language films in the US. These are filmmakers who undertake their work with what is clearly an artistic vision, but nevertheless one that seems (or historically seemed) to have large commercial potential. These are (very specifically) the kind of people who are being pushed out of theaters by superhero movies.
That makes them very different from filmmakers like David Lynch or Jim Jarmusch, whose work barely had any commercial potential to begin with, though it did sometimes appear at the megaplex. Beyond them, there are cascading levels of more and more eccentric visual artists whose work never even leaves art school. In this group, if you talk to them, you can hear endless opinions that the work of the artists that Scorsese name-drops, and others like them, is utterly crass commercialism. To them, it is undifferentiated from the MCU. It should be noted that there are many eccentric artists whose own work might be very out-there, relatively speaking, but who nevertheless really enjoy mainstream artistic experiences. Inputs don’t always match outputs, and the fact that they don’t is why art is interesting to begin with, if you think about it.
But Scorsese’s argument is a pretty self-serving one. If you read that list coming from anybody else, you would wonder why someone left the name Martin Scorsese off of it. These are the directors most like himself. In the ’90s, I thought Spike Lee was the director most like Scorsese, and he probably still is; it’s just that his career will never track Scorsese’s because movies don’t have the same cultural importance they did when Scorsese was at the height of his powers.
But more than any other filmmaker, more than Marvel films even, Scorsese devotes his attention in the piece to Alfred Hitchcock. This is admirable, as Hitchcock’s trademark silhouette is ever-vanishing from the public consciousness, and dueling high-profile biopics in 2012 (Hitchcock and The Girl) didn’t appear to slow that process.
Alfred Hitchcock is perhaps the most influential filmmaker of all time. You can argue a host of others made contributions which are more essential to the form (the Lumière Brothers, D.W. Griffith, Orson Welles, and so on), but if you survey actual directors about the artists who inspired them, his name will probably come up more often than any other. He had a groundbreaking career, and a ridiculously long one. He worked in film continuously from 1919 to 1976. He was also about as commercial as commercial gets. He made thrillers, a genre which then occupied the place horror does today; in fact, horror today has more in common with his work than with what was called horror in his time. He was extremely influential on the French New Wave, which in turn inspired much of the New Hollywood, a movement that had its apex at a time when Hitchcock was still working.
He was also very famous, personally, to his audiences. He has a cameo appearance in nearly all of his films, and these are often memorable. He gave many interviews, and also served for a decade as the host of his television anthology series, Alfred Hitchcock Presents. Hitchcock was his own brand, in an era where almost nobody went to see a movie based on who the director was.
Scorsese is aware of this. “I suppose you could say that Hitchcock was his own franchise. Or that he was our franchise,” he wrote in the Times, before outlining all of the ways he sees what is happening in movies today as distinct and different from Hitchcock. Scorsese is of the view that Hitchcock’s movies were rich character experiences, which is interesting, because other writers have noted that the same character types appear over and over in the director’s films. Like Scorsese, Hitchcock worked with some stars repeatedly, most notably Jimmy Stewart and Cary Grant (four times each). Scorsese has made five films with Leonardo DiCaprio, and The Irishman marks his ninth with De Niro.
But there are differences to this comparison. Scorsese isn’t doing what Hitchcock was doing 50 years ago. Last year, Irishman‘s budget was reported to have been $140m. This is said to have largely paid for the visual effects work that makes De Niro, Al Pacino, and Joe Pesci look decades younger. You can look at the results and judge whether the price was worth it. It’s not the most expensive movie ever made by a long shot, but it’s exceptionally high for an R-rated non-action movie. Only two other films it is competing against for Best Picture (Joker and Once Upon a Time in Hollywood) are R-rated non-action movies, and they cost $62.5m and $90m, respectively. Of the nine films nominated in total, Irishman is the most expensive by over forty million dollars. Closest to it is Ford v Ferrari at $97.6m, and that’s a PG-13 movie that you can take the entire family to. Joker, for all its own controversy and starkness, is a comic book movie. Hollywood stars the two biggest movie icons of their generation, and it’s a generation that hasn’t been supplanted yet, unlike De Niro’s. In fact, when Scorsese made his prior film, 2013’s The Wolf of Wall Street, with Hollywood‘s DiCaprio, it only cost $100m. Only.
Alfred Hitchcock’s last film, 1976’s Family Plot, only cost $4.5m to make, and earned $13.2m. Even adjusting for inflation, that original budget would only be about $20m today, or one seventh of what Irishman cost. And that was on the higher side for Hitchcock; Rear Window and Psycho only cost about a million apiece. They earned $36m and $50m, respectively. The highest price tag I could find on any Hitchcock film was $6m, and that was for the 1969 bomb Topaz.
Back in 1978, before The Deer Hunter had even been released, Michael Cimino was preparing his next picture. If the New Hollywood was killed by Cimino, this movie is what a prosecutor would hold up to the jury and identify as the murder weapon: the epic western Heaven’s Gate. Unlike The Deer Hunter, it’s not disputed that Cimino wrote this film. He had actually penned it in 1971, before even Thunderbolt and Lightfoot. At that time it was called The Johnson County War, and was based on the historical event of the same name: an often mythologized clash between settlers and wealthy ranchers in Wyoming at the end of the 19th century. Cimino had sat on it for seven years after initially failing to attract a big star.
History-wise, Cimino’s script, near as my research can indicate (and I’m no historian), gets the broad strokes right. In Wyoming, between 1890 and 1893, there were violent clashes between homesteaders and wealthy ranchers that included assassins hired to kill purported cattle thieves; ultimately those wealthy interests benefited from political power up to and including then-President Benjamin Harrison. In this way, it is like The Deer Hunter: this specific thing did not happen to these specific people, but nothing happens in this movie that is so far off the mark that it should be characterized as a distortion. Heaven’s Gate is a little different, in that it borrows the names of actual people from the real story, and crafts from whole cloth events that did not occur in their lives, in some cases perhaps maligning them. Other aspects, like the presentation of the homesteader group as composed mainly of hordes of Eastern European immigrants, are false, but there were such immigrants who did participate in settling the Western United States, so this misstatement of fact might be called a blending. In any case, the historicity did not create a controversy when this film was released. There was already way, way too much other controversy to fill up column inches in America’s newspapers.
Interestingly, if people took issue with Heaven’s Gate’s politics, it was for believing the story to be some kind of Marxist fairy tale. True enough, it’s a story of class warfare, where the villains are a mustached upper crust led by Sam Waterston, who, if he never literally twirls his mustache, keeps the viewer constantly waiting for him to do so. This wouldn’t be especially noteworthy, because many Hollywood movies tell this kind of story, if it weren’t coming from a director fresh off releasing a war movie that many thought might as well have been written by Henry Kissinger. Cimino took exception to people who tried to see any politics in his work, and these contradictory cinematic statements would seem to support him on that point. Speaking about popular interpretations of his career in one of the last interviews before he died, he said this:
“…first film, I was homophobic, second film, I was a right-wing fascist, third film, I was a left-wing racist, this and that, left-wing Marxist, and fourth film, I was a racist. So they couldn’t make up their mind what I was.”
But clearly these speculations about his work were something that mattered to him. The question by The Hollywood Reporter’s Seth Abramovitch that sparked that very detailed response was a mere “What about critics? Do their opinions matter?” It’s interesting that he forgot “right-wing racist” about the second film (Deer Hunter), but it’s Michael Cimino. He gets the broad strokes right. At the end of his life, at least, Cimino appears to have been a conservative of the Fox News flavor: elsewhere in the same interview he begins a rant that goes from the New England Patriots Deflategate scandal, to former Patriot and accused murderer Aaron Hernandez, to Hillary Clinton and Benghazi (he disapproves), to President Obama and Afghanistan (he’d rather not say what he thinks), at which point he says the “direction of [Abramovitch’s] exploration is taking [him] for a sour turn.” What did Abramovitch say to probe Cimino’s thoughts on these topics? The original question was “Have you become more reclusive?” Michael Cimino may be hard to figure out, but it’s not for lack of sharing his thoughts with you. At least if you can believe them, and again, that’s a big if.
In a week’s time, the luminaries of show business will congregate at the Dolby Theatre in Hollywood to learn who the big winners are at this year’s Oscars. This year, like every year, the biggest of the big will be the award for Best Picture. While the field is unusually crowded with popular films, and there is no obvious favorite either to take the prize or to fall to an upset, there is one certain loser: The Irishman. It will not win the award, nor will Martin Scorsese win Best Director for it.
This is not unusual. Scorsese typically loses at the Oscars, and the awards for Best Picture and Director which he secured for 2006’s The Departed were almost certainly consolation for so many past losses—studio Warner Bros. barely campaigned for him. By that point, his talent was undeniable and unceasing, and the years had made it difficult to explain why Raging Bull had been defeated in both these categories by Robert Redford’s Ordinary People in 1981, or why, a decade later, Scorsese’s Goodfellas had lost to Kevin Costner (yes, as director) and Dances with Wolves (yes, this happened). The Departed might be the weakest crime movie Scorsese ever made, with an impossible-to-ignore deficit in its third act, and a Jack Nicholson performance that can’t decide, scene to scene, whether he’s speaking like he’s from South Boston or Tarzana. It’s a good movie if you like Scorsese’s work in this vein or the actors, but it has its holes. As a Massachusetts native, this is difficult for me to concede. As one who elevates Scorsese above any other director living or dead (Casino is my favorite movie, bar none), it’s harder still. But as time passes I’m increasingly sure The Departed will not warrant a mention in the director’s obituary, unless it’s in the same sentence that mentions his receiving these two awards. That creates a problem for this year: there are definitely people still living and voting who don’t see the need to reward Scorsese a second time out of guilt.
But this isn’t why he’s going to lose. Ordinarily, I would just say this makes it likely he’s going to lose. Instead, I’m certain of it this year. This is because a short time ago, Scorsese elected to tell the motion picture industry, in so many words, to go fuck itself. In fact, had he chosen those specific words instead of what he actually said, I am sure his chances would be better. This is an industry (and an awards show) where roguish mavericks are well-regarded. Traitors, on the other hand, are a different story.
This is what Scorsese told Empire magazine upon The Irishman’s release:
The only time his ardour [for film] dims is when the subject of Marvel comes up. “I don’t see them,” he says of the MCU. “I tried, you know? But that’s not cinema. Honestly, the closest I can think of them, as well-made as they are, with actors doing the best they can under the circumstances, is theme parks. It isn’t the cinema of human beings trying to convey emotional, psychological experiences to another human being.”
Why does that sound like “fuck you”? Several reasons. For one, it’s frowned upon and petty to criticize anyone else’s movies. You will almost never hear it done publicly. More importantly, Scorsese chose to direct his ire at the biggest thing going in movies right now, and something unlike anything that has ever happened in popular film before.
The “MCU,” if you’re not hip, refers to the Marvel Cinematic Universe, a series of 23 films since 2008 (based on Marvel Comics characters) that all take place in the same “universe” (they share characters and events). None has lost money. Almost all are wildly successful, and last year’s culmination event Avengers: Endgame is probably the most successful motion picture of all time. This is the realization of a master plan by producer Kevin Feige and Disney after the latter’s 2009 purchase of Marvel Entertainment. This is completely unprecedented. The closest thing is probably the Harry Potter series, which earned $7.7b through eight films over a decade, between 2001 and 2011. The MCU has earned over $22b in just twelve years. The per-movie average revenue might not seem wildly different, but it is, for two reasons: 1) because the MCU is so vast (most movies are focused on different characters), Disney can focus its development resources on many of them at once; and 2) Harry Potter had a definite ceiling: the seven novels J.K. Rowling wrote. Harry Potter will return someday in reboot form, probably sooner than anyone suspects (unless you see the Fantastic Beasts series as already counting), but any way you measure it, Rowling’s catalog is limited. Marvel Comics (in its current form) has been publishing since 1961, and has created so many characters in that time that, if Disney wanted, this film series could probably continue for the next century without covering the same ground twice. The public’s appetite for this stuff will sour (it always does), but when it does, the precedent of a large, interrelated brand play that spans many groups of talent will remain. Disney’s other big purchase of the last decade, Star Wars, is already being capitalized in this same way. Warner Bros. would love to see the DC Comics universe achieve similar popularity, though its success so far has been mixed.
Disney would be a powerful enemy to have in any year, but they don’t control awards voting. Their ability to influence a vote by the people they hire is probably less than the influence General Motors has over its members of the United Auto Workers union. This is because those auto workers, though unionized and independent in representing their own interests, nevertheless rely on GM exclusively for their livelihood. If something bad for the company happens and the factory closes, they lose their jobs. In film production, most work is done on a contract basis, and continued employment is almost never guaranteed beyond that contract’s length. Most people will work for a broad swath of studios and other production entities during their careers. But there are exceptions. TV is a good job. If a show is successful and continues to get renewed, so does the employment. In film, a franchise is the closest thing to steady work, and even franchises that never take off probably had contract provisions binding the actors for possible future installments. But even Disney, the biggest studio by orders of magnitude, couldn’t marshal the awards voting necessary to change an outcome.
But Scorsese’s words didn’t just alienate Disney. They alienated an industry. The MCU, at least financially, is good for this industry. Consider the idea of steady employment, just from the perspective of actors. Take Tom Holland, the latest actor (and the first in the MCU) to play Spider-Man. He has only appeared in these movies since 2016. Nevertheless, he has appeared in five of them, three of which are not Spider-Man movies. Achieving the status of an MCU character can be solid work. There is no analog in the Scorsese Cinematic Universe. To be in five of those movies, you literally have to be Robert De Niro or Leonardo DiCaprio, and even then, you can be certain it won’t happen within four years. The Irishman marks Harvey Keitel’s sixth appearance in a Scorsese film, but he started with 1967’s Who’s That Knocking at My Door? A lot of young actors may admire Harvey Keitel for his unique presence and prolific screen credits, but far fewer aspire to match his career path.
For “below the line” crew, the MCU doesn’t offer much career security. Below the line people are anyone who is not an actor, producer, director, or screenwriter. If the next Avengers movie has its lights arranged and hung by a whole different team than the previous one, it probably won’t have much of an effect on whether people pay to go see it, so Disney doesn’t have much of an incentive to preserve continuity of personnel in these positions. But any time more movies (or TV) are being made, it’s good for people who work in these jobs. In 2006, the last year before the MCU began, only one movie (Pirates of the Caribbean: Dead Man’s Chest) made over a billion dollars worldwide. In 2019, nine movies pulled off this feat, with fully a third of them products of the MCU. None of the nine were original ideas; all were based on characters from prior movies. The total worldwide box office take was a record $42.5b. While this is almost certainly near a peak for theaters, because other forms of media will increasingly eat into the share of people’s time that theatrical exhibition consumes, it won’t change much for people who work in film production. People might watch at home, or (sigh) on their phones, but what they consume will still be categorized as what we call feature films, at least for the foreseeable future. The more money that can be made in film, the more films will be made. More work is better for everyone.
Even the many failing attempts at mimicking the MCU will mean paychecks for thousands of people who work below the line. This is a situation where below the line personnel have an advantage over their creative, above the line counterparts: a flop can hurt the career of a star or a director, and the people who control the money will take that flop as a statement by the public about the commercial viability of that artist’s work. By contrast, a key grip who works on that flop and performs their job well can add it to a solid portfolio of work in aid of their professional reputation, just as they would any hit.
Marvel movies are just one reason why we’re currently living in an age of unprecedented filmed content creation. The explosion of streaming has meant increased production in nearly all genres and formats. But Marvel’s unique success in feature films is what counts for purposes of this discussion, because feature films, exhibited in theaters, are what the Academy Awards recognize. Even the content from streamers like Netflix that has now become a staple of the Oscars has to make at least a token appearance in a theater to be eligible.
Meanwhile, in the Scorsese Cinematic Universe, the director’s comments to Empire were met with gasps, then anger, albeit tempered by the Hollywood manners Scorsese himself had failed to observe in his original remarks. To refute Scorsese’s take, frequent Marvelite Joss Whedon pointed to the work of his colleague, Guardians of the Galaxy director James Gunn. Despite disclaiming with, “I revere Marty,” (Whedon used the popular abbreviation of Scorsese’s first name no doubt to convey to Twitter that they are pals who might jump into a DeLorean together at any moment), the Avengers director said that Gunn packed his franchise with his “heart & guts,” countering Scorsese’s assertion that the MCU doesn’t convey emotional, psychological experiences.
Gunn was famously fired by Disney from the third installment of that franchise for comments in poor taste made many years earlier on Twitter, which seemed only a little more baffling than the company’s abrupt decision to rehire him eight months later. This time on Twitter, re: Scorsese, Gunn explained he had been “outraged” by picketing of Marty’s The Last Temptation of Christ by people who hadn’t even seen the movie, drawing a parallel to Scorsese’s uninformed commentary on Marvel. This response has many flaws, but here are three: 1) It assumes Martin Scorsese might like Guardians of the Galaxy if he saw it, which, no matter the film’s arguably objective qualities, is unlikely; 2) it equates the MCU’s defenders with religious activists who have zealotry enough to picket a Hollywood movie, and I doubt either group would embrace this comparison; and 3) James Gunn is apparently the type of person who gets “outraged” by other people’s outrage, a concept of living which is polluting our society right now, as exemplified by the snooping right-wing trolls who caused Gunn’s 2018 firing in response to his being a vocal critic of President Trump on Twitter. While I think, for the reasons stated, Gunn’s response was misguided, it was very diplomatic. Nearly all of the MCU’s responses to Scorsese were. MCU producer Feige’s was typical: “He has a great opinion, he’s a genius at what he does so he is obviously not wrong. But it is an opinion and he’s not completely right either.”
The harshest—if you could call them that—words I saw come out of the MCU camp were from Disney CEO and alleged supervillain Bob Iger: “I don’t think he’s ever seen a Marvel film,” Iger told the BBC, “Anyone who’s seen a Marvel film could not in all truth make that statement.” Any Marvel film? All 23 are that good? Obviously, I disagree with the latter portion of Iger’s quote, but when one commands Disney’s imperial forces, one is allowed to make such blanket statements, I guess. Here, Gunn’s firing comes to mind again. If the two were on roughly the same page about Scorsese’s words, I believe it’s a safe assumption that they were not simpatico on the day that regrettable dismissal occurred. During the strange valley between firing and rehiring, Iger told The Hollywood Reporter the plan to fire Gunn was presented to him “as a unanimous decision of a variety of executives at the studio and I supported it,” and “I haven’t second-guessed their decision.”
This is key. I don’t present Disney’s firing of Gunn merely to gossip about an embarrassing situation that appears now resolved in the eyes of all parties who matter—even if, yes, it is still a little fun to do that. It’s relevant here because Martin Scorsese began his career as part of a generation of filmmakers to whom it’s nearly impossible to imagine this happening. Granted, nobody in the 1970’s was getting fired over something they tweeted, but Scorsese’s place and time was unique even in that context. Scorsese was a member of the “New Hollywood,” or “American New Wave,” or “Hollywood Renaissance.” Whatever your preference, all three terms refer to an era that began in the late 1960’s and ended ten or fifteen years later. There’s reason to debate that last part, which I’ll get to.
The New Hollywood stood for, more than any other single idea, the “auteur theory” of filmmaking, a concept rooted mostly in French film criticism, which holds that the director is the artist on a film set, and the movie is his (and it was typically a he). Basically, this theory prescribes that, for a film to have maximal artistic value, the director’s power must be indulged as absolute. The opposing view, that film is a collaborative creative project, is rejected by the auteur theory. Whatever other artisans are necessary, their contribution is only as good as their execution of the director’s vision.
This last sentiment was never true, even in the 1970’s. All of the great films of the New Hollywood feature some kind of dynamic artist in a position other than director. These were mostly cinematographers, but very often actors and writers. There are also great films of the period from directors who were very possibly just lucky enough to end up working with those people, and who failed ever to make interesting films again. But, whatever the reality, directors in American movies never had so much control as they did during the New Hollywood’s reign.
The birth of the New Hollywood is typically seen as having occurred with three groundbreaking mainstream films: The Graduate and Bonnie and Clyde in 1967, then Easy Rider in 1969. There are others that matter which predate these, but if you take an introductory film course that covers the topic, I promise these are the ones you will hear about. These films did things with cameras and editing that hadn’t been seen outside of the avant-garde before. They were “young” movies, and all rang very true with the counterculture of the day. They were all unusually cynical for Hollywood fare, with none ending on a happy note.
And they all made a ton of money. None earned less than $60m at the box office (this low mark belonging to Easy Rider), and none cost more than $3m to produce (this high mark belonging to The Graduate). To put that success in some perspective: for Avengers: Endgame (remember, probably the most profitable movie ever) to provide a return on investment proportional to the one Easy Rider provided (150 times its original budget of $400,000), Avengers would have to earn at the box office not the $2.8b it generated there in 2019, but roughly $53b (150 times its reported budget of about $356m), an amount exceeding what all the movies on the planet combined generated in ticket sales in 2019. It may be counterintuitive to think of Easy Rider as the more financially ideal of the two, but if you were a business person, and things were as simple as this, you would choose to be in the Easy Rider business any day of the week.
Thus, artsy as the New Hollywood was, the Old Hollywood, ever desirous of a financial windfall, wasn’t standing too much in its way. Even the most artless philistine would want to get on that groovy gravy train, if they valued money. It was a good time to be a director, or a person who wrote that director’s checks. While the New Hollywood is a term that encompasses many films of the period, some of which are bad, and some of which lost money, the ones you will hear about over and over again are the ones that are both great and profitable: The Godfather I and II, Chinatown, The Exorcist, Butch Cassidy and the Sundance Kid, Dog Day Afternoon, and many others. All of these movies, despite their commercial success, came from directors who had at least one foot in the art world.
It was a good time to be Martin Scorsese. The aforementioned Who’s That Knocking at My Door (1967) is classed with the New Hollywood, and although it wasn’t a big hit, the critics did notice it. It was the director’s first film, and while his career had a slow build over a handful of lower-budget films, by the mid-70’s he had established himself as the creator of a truly transcendent film experience with Taxi Driver (1976). That movie is remarkable for several reasons, chief among them the talent and craft of Scorsese and everyone else involved. But it’s also remarkable as an example of the New Hollywood at its height.
Yes, Taxi Driver only cost $1.9m to make, but put that in context: this is a movie about an imbalanced cab driver (Robert De Niro) who scores a date with a girl way out of his league (Cybill Shepherd), so he takes her to a porno movie without warning her. When she’s offended and doesn’t want to see him again, his reaction is to formulate a meticulous plan to assassinate her boss, a Senator who is running for President. When that plan doesn’t work, he instead (in one of the goriest killings ever committed to film outside of the horror genre) murders the pimp of a 12-year-old prostitute (Jodie Foster) who, by the way, he’s managed to spend way too much of this movie talking to. Today we’d say creeping on. You couldn’t make this movie in the 50’s. You couldn’t make this movie today. You could only make this movie, this way, in the New Hollywood.
And arguably, 1976 was the time society was most likely to be shocked by Taxi Driver. This was still the age of shocking public assassinations. Bobby Kennedy hadn’t been dead ten years, and less than six months before Warner Bros. dumped Taxi Driver into a February release, two women had attempted to assassinate then-President Gerald Ford in two separate incidents, both in California, home of show business. One of those would-be Oswalds was the Manson Family’s own Jodie Foster-esque ingenue, Lynette “Squeaky” Fromme, and the grotesque killings that had already made her famous were themselves less than a decade in the past. Suffice it to say, this was probably not a movie America was writing fan letters to Warner Bros. asking to see. This would be the end of the story, except that, five years after the film’s release, in 1981, imbalanced Taxi Driver fan (or, imbalanced fan of imbalanced taxi drivers, which would describe him just as accurately) John Hinckley would shoot and very nearly kill then-President Ronald Reagan, all in an attempt to impress the actual Jodie Foster, after having become obsessed with Scorsese’s film. Despite what came before and whatever about it inspired Hinckley, Taxi Driver still earned about $28m at the box office, which still makes it substantially better in the return-on-investment department than anything called The Avengers. The New Hollywood was a weird time (or place, which would describe it just as accurately).
In literal Hollywood, or at least its creative community, the New Hollywood was a big hit too. Every Best Picture Academy Award during the 1970’s went to a film that was part of this movement, with the exception I draw being 1970’s Patton, but I should note that Wikipedia’s list of New Hollywood films disagrees with me on this point. The Academy even awarded this trophy to Francis Ford Coppola’s The Godfather Part II in 1975 despite having given the same award to the first one, two years earlier. Besides the back-to-backness of it, how many sequels can you think of that won Best Picture? The answer is one, The Lord of the Rings: The Return of the King in 2004, and had either of its two forerunners taken the award you can be sure King would not have. The only halfway memorable challengers that film had to surmount were Lost in Translation (incidentally from Coppola’s daughter, Sofia) and Mystic River. Godfather II had to defeat Chinatown and Coppola’s own The Conversation. This is true of many of those New Hollywood Oscars: in retrospect, the winner isn’t surprising, until you realize how stiff the competition was. In 1972, The French Connection beat A Clockwork Orange and The Last Picture Show. In 1974, The Sting beat American Graffiti and The Exorcist. And in 1976, One Flew Over The Cuckoo’s Nest beat Barry Lyndon, Dog Day Afternoon, Jaws, and Nashville. All of these unrewarded challengers are regarded as unmitigated classics today. But of course, in 1977, Taxi Driver lost to Rocky (so did Network), beginning a long trend of losses for Scorsese that would only end with The Departed.
But generally, and this is especially true if you study it in college, people believe the New Hollywood did end. The popular mythology is that it was done in by a pair of assassins named Steven Spielberg and George Lucas. This is because Jaws and Star Wars (the first one, which Lucas later rebranded Star Wars: A New Hope) rewrote the way we conceive of a movie’s success. Before those movies, the public didn’t care about box office receipts. Gone With the Wind was the most successful movie of all time, but most people didn’t know that, and those who did mostly couldn’t tell you how much money it had earned. They just knew it was something extremely popular, which they did or did not like themselves. Spielberg and Lucas set records with those films, not only for how much money they grossed in total, but for how quickly, specifically for what they earned in their opening weekends. Jaws and Star Wars are the whole reason why, to this day, any halfway comprehensive look at Monday morning news will include a story about what won and lost at the box office, quantified in millions. Prior to that, a movie’s distribution often started with a slow rollout, expanding to more and more cities and slowly building to its ultimate financial take. This model, the New Hollywood’s defenders will often tell you, better suited movies like The Godfather, which is totally commercial, but whose quality is better promoted by things like word of mouth, as opposed to the major marketing blitzes we associate with blockbusters today. In 2020, if a movie has a bad opening weekend, the exhibitors want to discharge it from their theaters as quickly as they can. There’s no building slowly to anything, even failure.
There are several problems with this theory. For one, Jaws (1975) and Star Wars (1977) are undeniably a part of the New Hollywood, and could no more have been made the way they were, by the people who made them, in another age than Taxi Driver could. They were merely drastically more successful than their peers. Lucas and Spielberg both made artier fare before these films, and if either movie had somehow failed commercially, they would be taught in film schools today as among the best of the New Hollywood. As it is, they are hardly taught at all, because pretty much everyone arriving on some campus to study film has already seen them. They lack the sexiness of obscure, failed genius that so arouses the lowercase-a academy. Another problem is that distinctly New Hollywood movies, even smaller ones, kept being made after Star Wars smashed conventional wisdom to bits in 1977. Legendary lecherous lothario Warren Beatty’s Casanova-communism adventure Reds didn’t come out until 1981, when it presumably served as a more constructive statement of displeasure with the Reagan administration than Hinckley’s, earlier that same year. The aforementioned Wikipedia article identifies New Hollywood films as late as Coppola’s greaser adaptation Rumble Fish in 1983—but curiously leaves off the list Coppola’s The Outsiders, also released that year, also based on one of writer S.E. Hinton’s young adult novels, and sharing talent with Rumble Fish both behind and in front of the camera. Presumably the added presence of Tom Cruise, Leif Garrett, and Ralph Macchio made The Outsiders just too unserious a business for that Wikipedian’s sensibilities. But Rumble Fish came out in October of 1983, a time when it would have almost certainly been sharing space in some theaters with Lucas’ third and final original Star Wars film, Return of the Jedi, and more than a year after Spielberg had shattered even Lucas’ Star Wars box office record (as it stood then) with E.T.
If the New Hollywood had concluded by that point, one assumes nobody told Francis Ford Coppola, or at the very least, nobody stopped him from spending $10m each to create those young adult adaptations. That’s not cheap. It’s about how much money Jaws and Star Wars were made for, eight and six years earlier, respectively. But on the other hand, in the film world of 1983, ten million dollars might not have seemed like an especially high price to put on a small movie about a bunch of teenagers, because of something that happened in the interim.
That something is the third, final, and most important reason why Steven Spielberg and George Lucas have less of the New Hollywood’s blood on their hands than the popular mythology suggests. That reason is Michael Cimino.
If you’re under 40, there is a good chance you have never heard of this man, even if you’ve recognized the names of every other filmmaker I’ve mentioned up to this point. Today, Michael Cimino is almost completely forgotten, and that’s just how a lot of people would prefer it, possibly including Cimino himself, who gave barely any interviews to American journalists during the last two decades of his life. He spoke more often to French outlets, but many infamous Americans do.
Cimino was a director of the New Hollywood who made three movies between 1974 and 1980. The first was entertaining, the second was among the absolute best the era produced, and the third, much more than Jaws or Star Wars, single-handedly obliterated the auteur theory among the money people. Thereafter, forever gone was the notion of directors as unquestionable gods who must be left alone so that they might rain money on their worshiping congregants in studio offices.
Cimino didn’t come from film school, the way Scorsese, Coppola, Spielberg, and Lucas all did. His talent, near as anyone can tell, was forged on Madison Avenue, where he directed commercials for American companies like Kool Cigarettes, United Airlines, and Pepsi. His education, however, was in the fine arts: he received his BA from Michigan State in 1959, then a BFA in 1961 and an MFA in 1963, both from Yale, both in painting, it would appear. If near-as-anyone-can-tell and it-would-appear sound cryptic, they’re just disclaimers. Because as much as he was a product of the Ivy League and a pitchman for Kodak Film, Michael Cimino was also an inveterate liar about his own background. The facts I have outlined here are just the ones to which others, and public records, are reported to attest. Cimino offered a much more detailed biography when he gave interviews, much of it subsequently disproven, and plenty of it subject to revision by the director himself in other, subsequent interviews.
Speaking at length to Vanity Fair in 2002, after a long period of seclusion, Cimino asserted many things, these words among them: “I was a child prodigy. Like Michelangelo, who could draw a perfect circle at age five. I was extremely gifted. I could paint a perfect portrait of someone at age five.” This would be very impressive, if true, and I cannot say for a fact that it is false. This is the kind of thing a celebrity can almost always get away with lying about, because how could it be disproved? A journalist would literally need to find someone who can remember that Michael Cimino was a shitty painter at age five, who drew poor circles, and this witness would further need to go on the record with their claims. And even supposing you could find a person with any memory at all of five-year-old Cimino’s painting ability, it would merely be their word against his. But of course, if you wanted to do all that, you would have to start with a simpler question: when was Michael Cimino born? In the same interview, Cimino alluded to the controversy surrounding his age:
“Let’s put an end to all this once and for all,” [Cimino] says at the [restaurant], where he pulls out what looks to be a photocopy of his passport. It shows his birth date as February 3, 1952.
Thus, Cimino was trying to prove, using documentary evidence, that he was about 50 years old when this interview was conducted. Unlike the claim about his painting prowess, this isn’t difficult for the reporter, Steve Garbarino, to verify, and he concludes that portion of the interview with the line, “Public records, however, put his birth year at 1939, making him 63.” This might not seem like a big deal. A lot of people lie about their age, particularly in the show business of Cimino’s era. But this is 2002. He’s heard of the internet. But he can be forgiven this indulgence, right? 13 years is a stretch for anyone, sure, but surely it’s not the Avengers: Endgame of lies about one’s age; the record must be much more substantial. How many people, though, are willing to forge documents in the endeavor? When you wrap your head around the baldness of Cimino’s falsifiable white lie here, coupled with the fact that he apparently arrived at this interview with a well-prepared plan to tell it, you get a picture of the kind of character we’re dealing with.
But if these things about Cimino’s personality were obvious in the early 70’s, they weren’t going to stop him. He came to Hollywood as a writer, but he had a career ahead of him so enormous it would eclipse the work he actually put on film.
In an eight-month period between October 1997 and June 1998, a hitherto obscure filmmaker emerged to release two mainstream movies about technology and the future. His name was Andrew Niccol, and if it wasn’t yet a household one, his sudden success and unique talent seemed to imply it would be on the lips of the culture for years to come.
Gattaca, which Niccol wrote and directed, was released in October to critical votes of confidence and a disappointing box office, albeit one that couldn’t have surprised many keen observers of show business. In that era, star power was essential to opening a traditional studio movie. Ethan Hawke and Uma Thurman, while by that time having earned the status of movie stars, were nevertheless not the type of stars whose names could automatically fill seats in theaters. There was success to be had for movies without big stars in the mid-90’s, in what was (often inaccurately) termed “indie” film, but the main determinant of that success was an arthouse credibility unavailable to a $36 million, mid-budget science fiction story like Gattaca. In practical terms, art credibility in widely-released films was primarily ordained via purchase by Harvey Weinstein. This is why “indie” movies like Pulp Fiction and Good Will Hunting were granted underdog nobility despite featuring the stars of Saturday Night Fever and Mork & Mindy. Gattaca was simply never going to fit this mold.
If Gattaca earning back a mere third of its budget was a defeat for Niccol, it couldn’t have hurt too badly. The release of his next movie was already imminent, and it had easily enough firepower to succeed on all fronts. The Truman Show, which Niccol wrote and Peter Weir (Dead Poets Society) directed, came out in June to immediate praise, both in the voices of critics and in the handing over of stacks of money by the moviegoing public. This was the film that introduced Jim Carrey as an actor in serious movies. When it came time for award nominations, the film proved a contender there too, coming within striking distance of three Oscars and winning a handful of smaller awards.
Both Gattaca and The Truman Show take place in dystopias created by modern technology. Gattaca is about a future where genetically engineered human beings are a reality, and a class structure exists which oppresses people who were conceived naturally, branding them incapable and inferior. The Truman Show is about a world wherein a man in his 30’s (legally adopted by a corporation at birth) is unknowingly trapped in a dome so gigantic as to be visible from space, his every waking (and sleeping) hour broadcast to the outside world through thousands of hidden cameras, and all of the people in his life secretly actors. Both of these movies take issues that were on the minds of people in the 90’s—the imminent cracking of the human genome, and the ubiquity of media and camera surveillance—and creatively imagine what terrifying futures might lie ahead if they are carried to their logical endpoints. As a teenager, I was blown away by both films, and more excited about Niccol than about any of the other newly-minted filmic geniuses of that period. What’s more, I was convinced, like many others, that these stories were prescient, and were teaching important lessons about the future we would all soon have to contend with. Two decades later, it is evident I was wrong on all counts. Movies don’t predict the future, and neither can I.
How able are we ever to see what the future holds? How do we know the things that will matter tomorrow are the ones we are choosing to concern ourselves with today? Try this experiment. Pick a day in recent history. One at least ten years in the past. Try your birthday. Then go to a library (or the internet) and find the front page of a newspaper from that date. Look at the stories that appear there, then ask yourself two questions. 1) What were the major events in history that transpired between then and now? and 2) What evidence is there on the page that anyone was contemplating those events in a meaningful way, before they transpired? The answer to the first question will vary depending on which date you choose, but the answer to the second question will almost invariably be that the front pages of newspapers do not forewarn of what will be written in the books of history.
I was born October 29, 1982. My adult life has been defined by two historical events. The first is the attacks of September 11, 2001, and the second is the 2008 Financial Crisis. I was 18 years old on 9/11, and approximately 25 when the economy fell apart. When I recently examined a New York Times from Friday, October 29, 1999 (20 years ago), I was surprised by the utter lack of reporting about the issues that would underpin world-changing events within a decade. Just kidding. It met completely with my expectations, because it was a totally contrived exercise meant to allow me to write the preceding paragraph. But consider the point anyway. That day, the Times’ front page was shared by several stories then germane to our national discourse: assisted suicide in Oregon; the prospect of across-the-board cuts to the federal budget; the meeting of five Republican presidential hopefuls not named George Bush. The sole photograph above the fold is actually about the Falun Gong religious sect and their plight under the Chinese Government. None of these things are in the news today. But with the exception of Bush’s inevitability as the nominee, they didn’t really come to any conclusion either. They simply petered out in the national conversation, displaced by things of greater interest. If there’s a point to this exercise, it is that there is very little predictive value in what we consider important enough to talk about in the present.
While terrorism and the economy were written about (the former sometimes, the latter often) in the New York Times (and the media as a whole) over the years leading up to 2001 and 2008, there was a decided shortfall of handwringing about the potential danger of the two aforementioned calamities occurring, right up until the point at which their impact was undeniable. That isn’t to say that nobody predicted these things. In the wake of both were born celebrities whose exclusive claim to relevance was such prognosticating. For 9/11, it was former national security official Richard A. Clarke, who went from being known (if at all) before the tragedy as someone difficult to work with, to today being the author of eleven books (which include four novels, as well as 2017’s earnestly-titled Warnings: Finding Cassandras to Stop Catastrophes). For the Financial Crisis, it was chiefly the investors profiled in Michael Lewis’ The Big Short, who are probably more famous now because of Adam McKay’s star-studded film version. In both instances, these were people who held firm to an unpopular view at their peril. In retrospect, they seem like nothing short of reluctant geniuses for their foresight.
In the aftermath of Gattaca and The Truman Show, I expected Andrew Niccol would eventually be adjudged a genius on par with those predictors. It might take a long time. It might take my whole lifetime. But it was going to happen. Over the years, those narratives would come to seem increasingly prescient, and these films would come to be regarded as something like what George Orwell’s literature was in my youth: a warning about the future which, while never literally coming true, is more and more seen as having predicted the issues with which the world many decades after its creation would contend.
In the short term, the evidence began to mount. At a celebratory White House press conference characteristic of the leaps and bounds that all of technology seemed to be taking at the time, President Clinton announced the Human Genome Project complete in June of 2000. In his remarks, the President compared the mapping of the human genome to the “frontier of the continent and of the imagination” that Thomas Jefferson had set out on a map in the same room two centuries earlier, in the wake of the Louisiana Purchase. “It is now conceivable,” said Clinton, “that our children’s children will know the term ‘cancer’ only as a constellation of stars.”
In Gattaca’s near future, Hawke’s Vincent Freeman is especially occupied with knowing the constellations of stars. His dream is to go to space as a celestial navigator for the Gattaca Corporation, an entity something like SpaceX, if it had the corporate culture of IBM in the 1950’s. But Vincent’s experience in the genetically enlightened future is not what President Clinton was picturing. Vincent was born a “god child,” as Thurman’s Irene Cassini phrases it in a revealing bit of pejorative slang. He was not genetically engineered, and therefore imperfect, so this ideal job opportunity is closed to him. In fact, when his genome is mapped (at birth), it is determined that Vincent has a high probability of several genetic disorders and a projected lifespan of only 30.2 years. But the force of his will is not to be underestimated. With a little luck, a lot of work, and one very intense sacrifice, Vincent is able to purchase and assume the identity of Jerome Morrow (Jude Law), a perfect human specimen whose competitive swimming career was cut short when he was paralyzed in a car accident, and who has now withdrawn into alcoholic self-annihilation. We join the narrative with Vincent already established in his double life, and the tension of the movie flows from whether he can keep up the ruse long enough to board the ship on his launch date and venture into space, or whether he will be caught by detectives who are investigating a murder which has taken place at Gattaca, and who are suddenly scrutinizing its workforce to determine who might be an “in-valid” (a person masquerading like Vincent is).
I didn’t think my children’s children would literally one day live in Gattaca’s oppressive dystopia. But I knew they sure weren’t going to wake up in Clinton’s rosy future—where suffering is but a memory—either. This new frontier was, in all likelihood, going to be more similar to President Jefferson’s than President Clinton had meant to imply with his analogy. For someone, suddenly living in the genetic future was going to be like what suddenly living in the United States of America was for the native peoples who had inhabited the West since before recorded history. That is to say, the future would be something substantially less than the complete success the Human Genome Project formally declared itself to be in 2003.
Elsewhere, The Truman Show yielded changes even more immediate. Within a decade of its release, the biggest thing happening on television was programming about the actual lives of real people. “Reality TV” had existed prior to The Truman Show in embryonic form, as FOX followed the men and women of law enforcement (and the men and women of humiliating contacts therewith) on Cops, and MTV was figuring out what happens when seven strangers are picked to live in a house together and have their lives taped on The Real World, which was said to be where people “stop being polite… and start getting real.” While these shows were successful during the 90’s, they were nothing compared to what was to follow. Reality TV, in its fully realized iteration, began with game shows like Survivor (which was an immediate hit in 2000 for CBS and remains strong today) which encouraged human beings to behave at their worst and betray each other for their own gain. While televised competition had existed since the inception of the medium, the history of game shows over the decades, from Twenty-One and Name That Tune through Wheel of Fortune and Supermarket Sweep, had failed to weaponize the darkest impulses of fame-and-money-hungry participants the way Reality TV’s game shows (like the aforementioned Survivor, Temptation Island, The Amazing Race, and so many others) began to in the early 2000’s. It helped, of course, if the worst people were selected for the mission, and then placed in contrived pressure situations, like animals forced into combat to perversely amuse spectators. Reality TV’s producers (themselves probably not our civilization’s best exemplars of ethical virtue) dramatically innovated in all of these dark endeavors. It was an entirely logical progression when, from this sinister ether, Donald Trump’s The Apprentice oozed into being in 2004.
Soon it became increasingly clear that the participants needn’t even be competing over anything for their worst traits to be exploited. From MTV’s hit with Laguna Beach: The Real Orange County (2004) to a success probably more monumental than any other in the history of cable television with Keeping Up with the Kardashians (2007), the ordinary doings of people (even if those people were not ordinary, but instead very rich and subsequently very famous) were what defined Reality TV as the years marched on. The idea, to ever-increasing dollar value, was that you would point a camera at someone, start rolling, and capture not a game show, or a scripted drama, or a sporting event, or a news broadcast, but simply the mere way they had chosen to spend their time, and it would be compelling for people at home to watch. This absolutely seemed to validate the thesis at the heart of The Truman Show.
Carrey’s Truman Burbank, of course, is much more mundane than all of that. He has lived his entire life on Seahaven Island, a chokingly pleasant seaside community where Truman sells insurance and lives with his wife, Meryl (Laura Linney). In his spare time, he enjoys driving golf balls with his best friend, Marlon (Noah Emmerich). Jim Carrey plays Truman in the same tones in which Peter Weir captures most of the rest of the film: the happy-go-lucky stereotype of 1950’s and 60’s television. The whole thing looks like an episode of The Donna Reed Show or My Three Sons, but in bright colors. Where drama and conflict exist in this world, they don’t venture much beyond that experienced by characters on The Many Loves of Dobie Gillis. All of the other characters (like Meryl and Marlon) are paid actors, with Truman being the only person having an authentic experience on his show, albeit one wherein he is blinded to reality as though chained in Plato’s cave. That all begins to change in the film’s opening moments, when a production light falls to earth from the ceiling of the enormous dome that actually houses Seahaven Island and shatters, as Truman’s idea of his reality soon will.
Niccol originally conceived of The Truman Show as something much darker. The 50’s sitcom aesthetic in the eventual product is the influence of director Weir, who had Niccol write sixteen drafts before he considered it ready for celluloid. One of those older drafts has been (supposedly, there’s no verifying the document’s authenticity) available online for years, and it is much more like Gattaca than what audiences eventually saw in the summer of 1998. While many of the events are the same, they are tonally different. In the movie, Jim Carrey’s Truman dreams of leaving Seahaven and reuniting with the girl who got away (was fired). The original Truman had similar feelings about his environment, the more urban “Utopia, Queens.” But only the latter reacts to his dissatisfaction with life by secretly consuming Jack Daniels in his car in the morning. That original draft has much more the feeling of being a thriller about escape from a world that is a sort of prison for its protagonist, like Gattaca. Other traces of the original, dark Truman remain as well, sometimes in surprising ways: when Truman has begun questioning his reality and is overtaken with paranoia, Meryl attempts to defuse him. In a brief moment, perhaps the film’s most dramatic beat, he yells aggressively at her. While it is as caged in comedy as much of the rest of the film, Truman violently grabs Meryl, and in that moment she is afraid for her life, which is exactly how Linney plays the interaction. If The Truman Show were made as Weir’s comedy in 2019, I do not believe this moment would make the final cut. The escape element survives into Weir’s film, dominating the third act, but the director has to painfully remind us of how unserious this business is by concluding the movie with Carrey repeating his saccharine catchphrase, “Good afternoon, good evening, and goodnight!” before departing the dome to continue his adventure without us.
Niccol’s old draft has a more conclusive and confirmative happy ending for Truman (it implies he finds the girl and they have a child), but his last line, spoken before the aforementioned bliss is told merely through images, is actually “Something had to be real!” shouted in the moment he confronts his creator, Christof, who was eventually played in the film by Ed Harris in an Oscar-nominated performance. In that moment, within the original draft, Truman is also himself being confronted by the vapidity and fakeness of the reality he has spent 34 years experiencing, and the stage direction Niccol leaves for the actor describes the aforementioned line as “a terrifying anguish.” In Weir’s film, Truman’s overall reaction to the revelation could be described as something more along the lines of “golly.” Earlier in this essay, I described The Truman Show as the film that introduced Jim Carrey as an actor in serious movies, rather than calling it the one which introduced him as a serious actor. This wasn’t by accident. That movie was Man on the Moon, which came out 18 months later. There is serious acting in The Truman Show (particularly by Harris and Linney), but Carrey doesn’t do much of it. His performance doesn’t veer into anything too far removed from what audiences saw him do as the bumbling dad set upon by a supernatural curse in Liar Liar a year earlier.
It didn’t take a Richard A. Clarke-level genius to note Truman’s foresight about Reality TV in the decade following the film’s release. If The Truman Show was mentioned, typically this connection was drawn. In 2008, Popular Mechanics named it one of the ten most prophetic science fiction films of all time. In that piece, Erik Sofge argued that, while Truman’s environment is unreal, so is the contrived environment of Reality TV, but what is compelling about both is the same element: “Truman simply lives, and the show’s popularity is its straightforward voyeurism.” In his analysis of the film’s plausibility, Sofge took issue (perhaps appropriately for a magazine that is supposed to be about science) only with a futuristic weather-machine Christof uses in an attempt to slow Truman’s escape. Weir has also commented on the ensuing developments on the small screen, saying in a 2005 DVD extra: “This was a dangerous film to make because it couldn’t happen. How ironic.” Reality TV’s success might discredit Weir’s choice to make Truman’s a comedic world, which even at the time of the movie’s release he said he had done because he didn’t believe film audiences would find it convincing that fictional (meta) television audiences would watch a show about a regular person. While it would be absurd to call The Truman Show a failure for Niccol (it, much more than Gattaca, is the reason people are willing to underwrite his movies to this day), time has seemed to suggest his unfilmed draft was more on the right side of history than Weir’s realized production.
Perhaps, when he sat down in the early 90’s to write screenplays which were unlikely ever to be produced, all of this was obvious to Andrew Niccol. With Reality TV’s progenitors already on the air and the Human Genome Project an inevitability since at least the discovery of DNA’s structure in 1953, it should have been obvious to everyone else too. But then, nearly everything looks obvious in retrospect.
From the comfortable vantage point of retrospect, both 9/11 and the Financial Crisis seem completely obvious. Take 9/11: how could one have predicted that Osama bin Laden would strike the United States? For starters, al-Qaeda had made no secret of their ambitions. It’s popular (and not incorrect) to believe that the national security establishment failed to heed the danger of a major domestic terror attack by Islamic extremists of the sort already very active elsewhere in the world. But the general public can’t credibly claim ignorance either. Bin Laden gave several interviews to western reporters during the 1990’s. Sure, a lot of people threaten a lot of crazy things, but anyone who believed him all-talk would have done well to notice when al-Qaeda literally attacked the United States in 1998 by bombing its embassies in Kenya and Tanzania, and again in 2000 when the USS Cole was bombed in Yemen. If these events seemed too remote by virtue of their place on foreign soil, there was then the (thankfully averted) “millennium” plot to bomb Los Angeles International Airport in December of 1999. And if all that wasn’t enough, of course, extremists linked to al-Qaeda had already bombed the World Trade Center in 1993. These things were not covered up or classified. All of them made The New York Times.
Still, the hijacking and weaponizing of commercial airliners was unprecedented, right? This is a dubious assumption as well. The hijacking of commercial flights by terrorists was routine during the 1970’s. This is well known. Less well known is the story of Samuel Byck, a mentally unstable Pennsylvanian who was killed (but not before managing to kill a pilot and a police officer) in February of 1974 while attempting to hijack a plane at Baltimore/Washington International Airport, so that he could fly it into the White House and assassinate Richard Nixon. This is not common knowledge but again, not a secret—it was reported at the time. Byck was actually known to have threatened Nixon after being denied a loan by the Small Business Administration, and had even been interviewed by the Secret Service, but they had believed him harmless. All-talk. While Byck never got off the ground, that didn’t stop it from being a bad week for air security around 1600 Pennsylvania Avenue: five days earlier, a jilted army private named Robert Kenneth Preston had stolen a military helicopter from Fort Meade, Maryland and (after taking a flying tour of the Capitol and successfully evading the jet fighters sent to intercept him) landed it on the South Lawn, whereupon the Secret Service opened fire on him with automatic weapons and shotguns. Miraculously, he suffered only superficial wounds such that he was able to depart the aircraft and charge the building, where he was finally tackled. For those counting, these events took place approximately three decades before 19 young men set their alarms early on a Tuesday to change the course of history. It bears mentioning that this is the same White House as is widely believed to have been the target of United 93, the one 9/11 plane on which the terrorists were prevented from completing their objective, when a revolt by its passengers caused the plane to crash in rural Pennsylvania, killing all on board.
The Financial Crisis is both more complicated and more predictable. The complexity is inherent to economics as a discipline; like the Great Depression before it, the various causes (and various solutions) of the Financial Crisis will be debated for as long as the event is remembered, even if only in academia. I have read five dense books on the subject, and remain wholly unqualified to deconstruct the systemic risk to the economy that had come to exist in 2008, let alone advise with any confidence how a repeat could be avoided. Many would tell you that greed was the cause of the crisis, but this is a bad explanation. Greed exists and is often a destructive element in human affairs, but greed was not invented at Bear Stearns or Lehman Brothers. Greed is as old as human agency. It is perhaps as simple as being the economic manifestation of hubris. The Biblical cliche is “thirty pieces of silver,” not “thirty credit default swaps.” There was plenty of greed in every prior Wall Street collapse, as well as in every celebrated boom. While relative compensation had increased in the financial sector (even if to a sick degree in its ostentation) by 2008, mere greed cannot explain why we got this crisis when we did. Again, the reasons this happened are complex. They do not fit a moralizing agenda intended to teach a parable about greed, no matter how true it is that our society includes many who need to be so taught.
Nevertheless, it seems inarguable that the crisis would not have transpired without the failure of the housing market that began in 2006. Why the housing bubble grew, and burst, is the subject of entire books in itself, but it isn’t difficult to summarize. Picture the least financially-stable member of your extended family. For purposes of this discussion, call this person “Cousin Larry.” Maybe Cousin Larry is irresponsible and reckless in managing his money, and leaves you forever shaking your head at the decisions he makes. Or maybe he is solid and trustworthy, but circumstances beyond his control have always left him barely able to keep his head above water. His character isn’t what’s important. The variable to pay attention to is his financial instability. He may have had a lifelong difficulty in finding a job that provides him a sustainable income, or he may be quite well-compensated but nevertheless always wind up with liabilities that exceed his assets. Now, ask yourself the following question: are you willing to co-sign a mortgage for a five-thousand-square-foot home in an expensive neighborhood for this person? If your answer is anything other than “Yes. And the nation’s entire banking system, private pension infrastructure, and the federal government should all co-sign the note as well. Oh, and is there anything available with six thousand square feet?” then you are wiser than former Federal Reserve Chair Alan Greenspan. Basically, in the early 21st century, the nation’s entire economic viability had come to be riding on whether or not Cousin Larry was going to be able to keep up with his debt.
And by 2006, Cousin Larry had the aforementioned home, as well as mortgages on three other properties that he was trying to “flip”, to take advantage of the hot housing market that the rampant buying of so many cousins-Larry had created, not to mention the fact that he was also making substantial payments on a new Hummer financed over six years, which he used to travel back and forth from The Home Depot where he bought supplies to renovate those other properties, purchased using yet a third kind of debt in the form of a high-interest credit card. Sound ‘sound’, so to speak? And through a myriad of complexities which, like the threat of terrorism before 9/11, were not a secret, practically the whole of the economy had been bet on whether Cousin Larry’s plan was sound. It must be said that, in the devastating event that followed, many good people who were not stretched as thin as Cousin Larry would lose their homes to foreclosure. Cousins-Larry were never a majority of American home buyers. But without the economic behavior of Cousin Larry, there would never have been a crisis so large as to engulf them.
If you have a real-life Cousin Larry whose behavior you have observed over the years, you would probably describe how things worked out in the economy as entirely predictable. And while some did point out the growing risk in the system, they were few and far between, and even more seldom heard. If greed were really the factor to pay attention to, all of the greedy people would have taken The Big Short’s eponymous short play, as it was the key to untold riches for those who believed in it. But the greedy instead pressed on with their flawed confidence in housing as bulletproof, and the collapse of 2008 burned them too. Perhaps this is because, in any bull market, there is always a contingent predicting impending doom. It’s never too bad a prediction. Financial downturns have occurred regularly throughout our nation’s history. The question is never if, but simply, how long until. In a world where naysayers can reliably exist, they can reliably be ignored. The bursting of the housing bubble in 2006 is now to the 2008 collapse what al-Qaeda’s attacks in 1998 and 2000 are to 9/11: an utterly clear warning about what was to come, but one that only shines with that clarity through the lens of hindsight.
I say this with the confidence exclusive to someone writing more than a decade after either event. Full disclosure: I am the last person who saw these things coming, or even recognized them for what they were, when they did. When the first plane hit the World Trade Center, for example, I was watching a rerun of Beverly Hills, 90210 on the FX Network, and spent an embarrassing amount of time believing (as the newscasters kept repeating) that it was an ordinary accident involving a small aircraft, and wondering when the news coverage that had broken into my broadcast would abate, and 90210 would resume to reveal to me the fate of Dylan McKay within the episode’s narrative. While both 9/11 and the Financial Crisis seemed totally unprecedented as they unfolded, it was not clear contemporaneously that these events would forever change American life. Even after the ink was dry on national legislation like the Authorization for Use of Military Force Against Terrorists, the USA PATRIOT Act, or the Dodd–Frank Wall Street Reform and Consumer Protection Act, how much bearing these new laws would have (and how permanent they might be) was an open question.
The permanence of 9/11 was still up for debate (though not much debated) by the time another Andrew Niccol movie came out in August of 2002, when a second Iraq War was still a hypothetical. Simone (sometimes called S1m0ne), which he wrote and directed (as he would all the films he has subsequently made, rendering his collaboration with Weir an outlier), debuted in ninth place. The movie earned just shy of $4m during a crowded summer weekend at the box office, and would go on to generate less than $20m worldwide, a total that even The Crocodile Hunter: Collision Course managed to surpass that summer. But unlike the similarly-performing Gattaca, the critics weren’t so much on Niccol’s side this time. Even if the reviews weren’t uniformly bad, the critics didn’t think anything particularly interesting was going on in this movie either. For a filmmaker who had, by that point, built a reputation for creating worlds that were thought-provoking, this is probably the unkindest possible cut.
That Simone is uninteresting as a narrative motion picture, fails to take itself seriously in the worst way, and further fails to deliver even on its limited premise, are all points I will not contest. I absolutely believed all of these things when I saw it in 2002. But if I had instead reacted by going to grad school and making a life for myself as an academic specializing in the work of Andrew Niccol, I have no doubt I could be on my 50th journal article about Simone by this point. I could be the world’s foremost Simone scholar. This movie is awful, but still I have that much to say about it. Simone is a masterpiece of recursive irony, none of it intentional.
The movie is something akin to what was once termed a “screwball comedy,” the satires of the 1930s and ’40s that typically centered on a humorous battle of the sexes. In 2002, the film was classified by some as science-fiction, but anyone watching it today will find this classification about as credible as the contemporary notion that Saddam Hussein was on the verge of possessing a nuclear weapon. The Crocodile Hunter movie I mentioned earlier is actually about Steve Irwin mistaking CIA agents for poachers after a crocodile swallows a top-secret U.S. satellite beacon, but that doesn’t make that movie a “political thriller.” So no, Simone is not science-fiction. While a technology that doesn’t yet exist may be integral to Simone’s story, it is entirely secondary to the shallow satire of Hollywood’s golden age that this movie spends 118 minutes concerning itself with.
Al Pacino stars as Viktor Taransky, a film director at his wits’ end with the temperamental actress Nicola Anders (Winona Ryder). Get it? “Nicola Anders”? Like “Andrew Niccol”? Neither do I. Anyhow, when she storms off his set and refuses to complete his latest movie, it appears the project will have to be abandoned, which will likely put the final nail in the coffin of his flagging career. But, as fate (fate being primarily characterized in this world by its zaniness) would have it, a dying oddball computer programmer leaves a secret hard disk to Taransky in his will. When the director turns it on, voilà—he is met with an advanced piece of software known as Simulation One (or Sim-One, or Simone, get it? perhaps), which is a computer-generated actress played by the model Rachel Roberts. Taransky uses Simone to replace Anders in his movie, which is then a smash hit. The public falls completely in love with what they are led to believe is a real person, and Taransky is back on top. But how long will this ruse last? And how many bits of forgettable comic business—like Pacino using a Barbie to cast a silhouette on hotel drapes to fool the paparazzi, or putting lipstick on himself to kiss autographed headshots (yes, this literally happens)—can Niccol pencil into this film’s remaining two-thirds before concluding things with an ending that is both totally implausible and 100% predictable?
The main flaw here, in a film with many, is that Roberts’ Simone has no personality. Her lines—which in the world of the technology onscreen are generated by Pacino reading them into a microphone, and the computerized Simone then parroting what he does, resulting in a presentation that haunts my nightmares—always have the flatness you notice when people not accustomed to being on camera read from prepared statements at press conferences, or when hostages are forced to do the same thing by their captors. “This is Simone, and I am making this statement of my own volition.” It all has that flavor. An obvious answer for why is that Rachel Roberts is primarily a fashion model, not an actress. That’s certainly what I assumed to be the case when I saw this movie in 2002. But the reality is something more complicated. Roberts hasn’t acted much since Simone, but when she appeared as a hippie who introduces Don Draper to hashish in a 2013 episode of Mad Men, none of the robotic stuff was evident. It was 100% believable and charismatic. This, despite the fact that storylines on Mad Men about the subject of Don Draper and hippies were about as effective as songs by Right Said Fred about subjects other than the problem of being too sexy. I had to take an acting class to graduate college, and after a semester of coursework, the main thing I had learned was that acting is not easy. Taking words off a page and making them sound natural from your own mouth (to say nothing of demonstrating high-level emotions) is really hard. Nevertheless, the role on Mad Men and a few other sporadic examples definitely attest that this is a talent Rachel Roberts possesses. She may never be Meryl Streep, but she’s definitely not the actress one would think if this movie were all one had to judge by. So something else is going on here. I can’t say exactly what that is; I can only surmise that this is the performance Niccol wanted her to give.
This, to her misfortune, is the product of his direction. Perhaps he was concerned that having a real person play a computer-generated actress would not be effective otherwise, the way Weir thought a Truman Show wouldn’t be believable if it resembled actual real life. This isn’t a bad thought, except for one reality that can’t be ignored: the entire plot of this movie hinges on the supposition that moviegoers (and literally everyone else, as her impact as a public personality comes to grow beyond that industry) would see Simone and fall wildly in love with her. At one point, tuxedoed partygoers run crazed and topple over into a body of water merely because they mistake another woman from behind for Simone. This is the reality (however heightened) Andrew Niccol wants us to accept in this movie. And Roberts’ android-like presence is what that reality is predicated on. While, like I said, I don’t believe this is a science-fiction movie, the problem at work here is one you typically see in that genre.
In science-fiction, the rules can be whatever you want them to be. The only limiting principle is that they have to be consistent. If George Lucas had wanted to give Luke Skywalker the power of flight in The Empire Strikes Back, he could have done it. Luke does other things that are vaguely super-powered. It wouldn’t have changed too much about the movie. Audiences in 1980 would have just accepted that this was one of the ways he uses the Force, and nobody would have said a thing about it in the last 40 years. What George Lucas could not have done is give Luke the power of flight when he’s training with Yoda, and then have him still fall down that shaft and out into open air, clinging for his life, at the end of the movie after he fights Darth Vader. Audiences would have said, “Why doesn’t he just use that flying power we saw him demonstrate a half-hour ago?” The rules wouldn’t be internally consistent. Andrew Niccol cannot have Rachel Roberts give this robotic, remote performance, and then ask us to believe that people in this world would not only fail to question her reality, but would fall more psychotically in love with her than teenage girls do with the Beatles in A Hard Day’s Night.
The power of Simone’s failure on that point is astonishing. It is so sweeping as to make Catherine Keener seem like a bad actress. I’ve never had this reaction to Keener, even in a bad movie, but Niccol’s ability is such as to be able to bend reality to this Herculean end. Keener plays Pacino’s ex-wife, and his boss at the film studio. Through a series of misunderstandings, people begin to believe that Taransky and Simone are romantically involved, when what the director actually wants is to reunite with Keener’s character so that they can be a happy family again with the daughter they share (a very young Evan Rachel Wood). Just when it seems that Taransky has leveraged his imaginary romance with Simone into successful manipulation of his wife toward this goal, and they embrace, she abruptly breaks it off and pulls back: “Oh, Viktor, I could never compete with Simone. How could any woman?” she says. This utterly unconvincing scene is the worst acting of Catherine Keener’s career, and nobody could blame her.
I’m not ordinarily one to criticize movies for being sexist. The kinds of movies whose sexism I might find objectionable are typically ones I avoid seeing altogether. The horror genre contains many such entries. Those are obvious. But I realize there are other, more subtle ways for sexism to figure into a movie, and this is further complicated by the fact that one’s impression of art is a pretty subjective experience. That said, the sexism in Simone is pretty difficult to ignore. Winona Ryder has too much agency, and she’s too mouthy, so Al Pacino replaces her with the ideal woman: a machine whose every move is literally the execution of his will. Then he uses that fake woman to manipulate yet a third woman (Keener) into having sex with him, a plan that fails only because she is stupid enough to believe his sham. When even the fake woman becomes too much of a problem, he (and SPOILER ALERT here) LITERALLY MURDERS HER. Sure, she’s not real, but that doesn’t stop him from being accidentally framed and jailed for her killing in a sequence that’s pretty dark, before a poorly explained deus ex machina returns things to normal and the credits roll. 2002 was (was it?) a less enlightened age in mainstream entertainment, but I’m pretty sure we had already evolved beyond onscreen worlds where women must exclusively be actual puppets for the male protagonist without any identity of their own, or, at their most evolved, vindictive harpies whose sole function is to thwart his will. If I’m reading into things here, in fairness to me, those things are written in some pretty large print.
It’s arguable who gives the best performance in Simone. It’s probably Ryder because, though her role is limited and not well written, the actress is always a compelling presence. But I would argue an alternate theory: the bright spot in the movie is Evan Rachel Wood as Taransky’s daughter. This is the relationship that comes closest to seeming like something that is both real and worth paying attention to. This is very ironic because, more than a decade later, Wood was nominated for a slew of awards (including an Emmy and a Golden Globe, as well as the Critics’ Choice Award, which she won) for playing an actual robot on HBO’s Westworld. That show, which is both lauded and controversial, takes an entirely different path in addressing artificial humanity than Niccol did. It presents androids as indistinguishable from human beings, in a world where the humans generally know they are fake, and imagines the ethical complexities that would result. The result is a much more engaging and realistic world than a single frame of Simone manages to be.
But all of this isn’t why I could write 50 papers about Simone, even if you feel like you’ve read that many by this point. What’s really interesting about this movie is the context. It is sometimes credibly argued that any piece of art is about its creator more than anything else. This is analogous to the truth that any dream is about the dreamer. If you dream about your parents, you will not learn more about them from the dream, because they weren’t actually there and whatever they were doing was just a manifestation of your own thoughts and feelings about them. But in appraising that dream, you may learn something you didn’t consciously realize about those thoughts and feelings, and hence, about yourself. By this theory, all of Andrew Niccol’s movies should communicate something about him as a person, and his particular worldview. If this is true for any film, it’s doubly true here, because this movie is about a film director. Not only is that Niccol’s actual profession, but it is also a singular one.
The power of a film director over the people he or she controls is enormous. It is the nature of big studio films that these people generally have absolute control over armies of hundreds of employees for the duration of a production. Anyone who has spent any length of time on a film set knows that, while a director may be (and often is) a very kind person who respects everyone’s dignity, those who are not are generally allowed to flaunt their power without any accountability for how outrageous their conduct might be. Producers and others who control money can control directors, but it is completely ordinary for producers to stay out of the way, barring something totally unusual or, more often, prohibitively costly. Producers often spend their working days miles away in offices, where they control and subjugate their own separate army. The set is the director’s domain, absolutely. When a director behaves badly, that director’s army will generally still execute their will unquestioningly. At no level is the work like that of an ordinary job. The culture is that, if one challenges something they believe is wrong, they must do so with the very real fear that not only will they lose this job, but that they will be branded as someone difficult to work with in this entire, very insular industry. The director Stanley Kubrick is said to have abused the actress Shelley Duvall with such acute psychological torture on the set of The Shining as to nearly ruin her as a human being. Kubrick sought to isolate her such that she felt totally alone. He frequently told her she was wasting the time of everyone on set. He can be heard in a recording telling others, “Don’t sympathize with Shelley.” I have had bad bosses, but never has one instructed a coworker to deprive me of sympathy. In any other work environment, it would be simply preposterous, if it weren’t primarily so twisted an annihilation of a basic human need. But this is art. Kubrick is and forever will be hailed as a genius.
Many revelations of recent years, in the era of #MeToo, are unsurprising when viewed in light of this information. There’s a lot of finger pointing about who knew what about the abuses of Harvey Weinstein, but nobody who came anywhere near his orbit can credibly claim they didn’t know he was a tyrant. Like a tyrannical director, a tyrannical producer such as Weinstein is often celebrated for the force of will that underpins his artistic conquests. In addition to his closet full of Oscars, Weinstein was honored by GLAAD, the Producers Guild, and Harvard University. Nothing I have thus far alleged about Weinstein was controversial to say when those awards were given. Peter Biskind wrote Down and Dirty Pictures, an entire (not obscure) book chronicling Weinstein’s megalomania, way back in 2004. It didn’t seem reason enough, to Hollywood, to challenge Weinstein’s reputation as a global humanitarian. Somewhere, a former low-level Miramax staffer is right now writing a check to a psychiatrist, and disagreeing. That Weinstein was also a sexual predator whose power-madness was so extreme as to motivate him to trap journalist Lauren Sivan in a restaurant basement and force her to watch him masturbate into a potted plant is shocking and disgusting, but it is not revelatory about his character. It completely fits with what everyone already knew about him 20 years ago. Biskind’s book seems today, at best, incurious for having missed Weinstein’s predations against women, but that book does conclude with the mogul physically assaulting a journalist. Again, published in 2004.
This is the film industry. It is not like other industries. To many who work in it, the only part of 1994’s Swimming with Sharks (a breakout role for Kevin Spacey as a monstrously cruel Hollywood boss) that seems beyond belief is its tidy ending. Many people, in other fields, have had a boss they think of as cruel, or pervy, or who crosses boundaries, but media is simply peerless in this regard. While #MeToo may go a long way toward stopping the sexual line-crossing, it may be limited to that arena. “Okay, no more sex stuff,” one can imagine being the full extent of these sociopaths’ self-appraisals. It’s very likely that ordinary platonic abuse of human dignity will continue to be tolerated. That psychiatrist is going to be in business for a long time. Whether or not Andrew Niccol meant to graze these issues, this is the same Hollywood which he chose to write about in Simone.
Now I don’t know anything about Andrew Niccol as a human being, or as the manager of a film set. I have literally not a single reason to think he’s a showbiz sociopath of this sort. When he is interviewed, he seems like a quiet guy from New Zealand, which is probably what he is. But any person, no matter how nice, takes the director’s chair with the full weight of the foregoing information about Hollywood power dynamics. This is what you should bear in mind, when I relate the following facts as I understand them:
When Andrew Niccol wrote and directed Simone, he was a film director in his thirties, married with a young daughter. He chose to write this movie about an older film director, who is divorced with a teenage daughter, and who becomes the subject of rumors that he is having an affair with his young artificial movie star, who is actually his Frankenstein creation. He cast in this role Rachel Roberts, a fashion model, a profession generally associated with artificiality. On August 13, 2002, he attended the premiere of Simone with his wife, the actress Susan Jennifer (Grace) Sullivan, who is best known for Friday the 13th Part VII: The New Blood, a movie where Jason murders her by slamming an axe into her face. Niccol further chose, in crafting his movie, to have the movie director character be accused of the murder of his young actress. What happened next is unclear. The gossip pages don’t usually report on obscure film directors, nor on actresses who are not otherwise of note. However, when 2002 ended, Niccol was no longer married to Grace Sullivan. Sullivan later became the subject of an internet rumor that she had died in 2009, which was so pervasive that IMDb to this day lists her as deceased, though online Friday the 13th fans claim to have debunked it, with compelling proof that she was alive at least as recently as 2013, having battled cancer and shared her story of overcoming it online. However, that very strange tale is only a digression from the fact that, by the time 2002 ended, Andrew Niccol had somehow become the husband of Rachel Roberts.
There are ironies within ironies within ironies to explore in this odd, incomplete narrative. Niccol and Roberts are married to this day, and she appears in small parts in nearly all of the films he makes. Perhaps he expected Simone’s audience would take for granted that she was lovable on the screen because he had fallen in love with her, and was too blinded by his love to realize that what he saw in her was not translating in the android-like screen presence he imbued this movie with. Like I said, Andrew Niccol seems, in interviews, like a very low-key guy. He is not crazy. He is a successful auteur and will probably live happily ever after, as much as anyone does outside of the movies. But I guarantee you that if ever he were to achieve infamy for some reason, like by committing a sensational crime the way celebrities sometimes do, Simone is the exclusive film of his that internet crime-obsessives would parse for clues. Its self-examining rabbit hole would take up the whole screen.
Out of Niccol’s visions of technology’s future, Simone is both the most mundane and the one that seems most inarguably certain to become reality in the near term. We can already create digital actors, and one day soon they will be as vivid as real people. Whether an artificial celebrity will eventually conquer pop culture the way Simone argues, only time will tell. The takeaway is that Niccol fails to convince viewers it’s possible in the world he authors here.
Ultimately, Simone’s Achilles’ heel is its misconception of human psychology. In this situation, faced with this technology, people would simply not act as Andrew Niccol imagines they would. As time went on in the years following the release of all three of his aforementioned films, it became apparent that something like this had been plaguing his work from the beginning. In 2003, Spike TV tried to create The Truman Show as a literal concept. The Joe Schmo Show was about an ordinary person, Matt Kennedy Gould, who believed he was having ordinary interactions with other ordinary people. Unlike Truman, Joe Schmo knew he was on a reality competition show, but that was all he knew. He was unaware that all of the other characters were actors and that what was happening to him was a scripted contrivance. The result? It wasn’t compelling television. Joe Schmo’s was not the face that launched a billion-dollar enterprise like the kind pictured in The Truman Show. It ran for two years and was a hit relative to the size of Spike’s audience, but did not make waves, even as Reality TV was picking up steam. Spike tried to revive the concept in 2013, but it didn’t last then either. Gould’s reaction to learning of the hoax in the first season’s finale also rebuts Truman Burbank’s authenticity as a character. While Joe Schmo did plaintively cry “What is going on?”, he then received a $100,000 prize and enjoyed brief fame when the episodes aired, fully cooperating and never demonstrating a feeling that he had been betrayed, despite the lies.
Truman’s lack of authenticity is apparent when the film is viewed today. Again, this is a product of hindsight bias. We know today what works and doesn’t, and how, on Reality TV. If you have teenage children who enjoy Reality TV, they were not born when this movie was new. If you try showing it to them now, and advise them you are doing so because of their interest in Reality TV, they will be utterly baffled. What’s on screen won’t resemble the thing they like in the slightest. Today, The Truman Show is much less like Reality TV, and much more like ’50s TV, but most like a middling 1998 comedy movie. It is striking, watching now, how much less affecting it is than it was. This is because in 1998, Niccol had to compete with other visions of the future that were being contemplated at the time, and his unique one could hold its own. In 2019, the movie has to compete with the actual world as we’ve known and lived it, and The Truman Show is hopelessly outgunned on this front.
In the long run, Niccol’s original, dark vision may have actually defeated Weir’s campy one. In 2014, the psychiatrist Joel Gold and his brother, the neurophilosopher Ian Gold, penned the book Suspicious Minds: How Culture Shapes Madness. This book is actually an excellent primer on delusional belief in human beings, its possible causes, and its many manifestations. But the hook that draws the reader in (and what came to populate press materials and news stories about the book’s release) was Joel Gold’s work with patients suffering from what he termed “The Truman Show Delusion.” This is precisely what it sounds like: a false belief that a person’s life is secretly a TV show, and that this reality is being hidden from them. While the term was coined for Niccol’s work, the patients in the book exhibiting this syndrome often seem to be inspired more by actual Reality TV as we know it. They are people who crave the self-importance of fame. Nevertheless, these are dark stories. The people afflicted do not come off like Peter Weir’s Truman or a Kardashian. If they have an analog (if there is an art this life is most imitating), it is much more the dark Truman of Niccol’s original draft. But it’s not surprising that the actual movie would still appeal to them.
In 500 years, nothing in The Truman Show will seem like the technology of the future. Many people probably will live in enormous dorms, as colonization of other planets becomes a reality. Cameras will be more ubiquitous than the movie ever conceived; it’s even possible that 3D scanners will record every moment of all of our lives in a perfect facsimile of what was experienced. Even the weather-machine isn’t beyond the pale. Everything in The Truman Show will appear, then, as a snapshot of a past age. That being so, what will a person living in 500 years describe this movie as being about? I venture the answer is paranoia. This is a movie about a paranoid person who develops an absurd contrarian belief that doesn’t match reality, and which everyone who loves him tries to dissuade him from thinking. Of course in the end, he’s right. Just because he was paranoid didn’t mean they weren’t out to get him. Or at least use him. This film is a very powerful statement for real people who are suffering delusions. It says that even when reality is constantly knocking back your warped perceptions, your hunch may actually be onto something, and in the end it will be validated.
It has to be said that delusions are never the product of their content. The Brothers Gold do an excellent job of explaining this. For example (my example, not theirs), a very clichéd delusion of the last several decades is the claim that the CIA put a chip in your brain to spy on you. Obviously, this delusion can only exist in the world as we know it, where Congress chartered the CIA in 1947, and the microchip was invented through the work of Jack Kilby and Robert Noyce shortly thereafter. However, in a world where these things had never happened, the person who would otherwise have believed them is still not going to be a happy, well-adjusted one. He instead believes that the Illuminati are trying to assassinate him, or that General Motors wants to kidnap him and steal his revolutionary ideas, and so on. Seeing The Truman Show can’t make anyone delusional. The delusional simply turn on the movie and see a reflection of their sickness that functions as wild wish-fulfillment. Even in other movies that lend credibility to paranoia, you don’t ordinarily see the kind of everyman normalcy in the main character that Truman is imbued with. How could he be crazy? He’s the affable Jim Carrey.
That may last forever. Delusional people may still find this movie relatable in 500 years. But for regular people, The Truman Show looks less and less like the world they live in every year, not more and more. Certain things just jump off the screen as implausible, like when Truman is able to escape being tracked within his own world. Despite all those cameras, they can’t find him, because they don’t know where to look. This is a movie that is purported to have insight about where the world was going in 1998. I could say “blah blah GPS…” and make an overcomplicated argument, but a simple one will do. Here’s a question: if the writers of the aforementioned Crocodile Hunter were able to understand the technology of a tracking device, such that they could make it a plot point in their movie in 2002, which was for children, is it really reasonable for Niccol and Weir to ask that we believe Truman’s TV overlords wouldn’t have insisted on it? They have a fucking weather-machine.
In 2011, NASA named Gattaca the most plausible science-fiction film of all time. The impetus for even ruling on the issue was the then-newsworthy 2012 apocalypse conspiracy, which had spread much astronomical misinformation (remember the imminent return of Planet X?) and spawned a film of the same name. As you are now reading this, that movie did not predict the future, something the world could be concretely certain of when the year 2012 ended, as calendars predicted, on December 31st, with the world still extant. But NASA felt the need to get out in front of the story (you imagine the Russians would have probably used a pencil and simply waited for the calendar to turn). In the 1960s, it was a common refrain amongst contrarians to bemoan the fact that NASA was spending billions on spaceflight while there were still Americans living in poverty. Today, as vacancy on the moon is at historic highs, they’re apparently spending the money on ranking movies. Meanwhile, Jeff Bezos is managing to do both at once. The science honor for Gattaca wasn’t undeserved, because the science- of the film is painfully plausible, if not inevitable. Since Gattaca never speaks too much about how this is all being done, it doesn’t make any promises that are too big about the technology. The problem is, the -fiction is not especially realistic. It’s never clear why Vincent wants to become a celestial navigator and leave earth. Is it because they say he can’t? He’s not a defiant character otherwise. The closest explanation of his motivation seems to be evident in his need to beat his brother in swimming contests, and these are presented as being thematically important, but they don’t really shed light on his character the way one surmises Niccol might have intended them to. He’s a straight-laced square who fits perfectly into his Gattaca box. He has no problem living with the superhuman discipline necessary to protect his hidden identity.
The only cracks in his armor (the times when he seems to have the emotional dimensions of an actual person) appear in his budding romance with Irene, which has the unfortunate timing of beginning less than a week before he is supposed to launch. But if this gives him any hesitation about leaving forever, if he is at all torn about making the old Irish exit and never seeing her again, we don’t see it. What awaits him in space? Why go?
Believe it or not, 2011 was not the first time NASA had offered official findings on the subject of Gattaca. In 2007, the agency published Societal Impact of Spaceflight, a scholarly collection that examined whether the last five decades of venturing forth from earth had been worthwhile for civilization. In an essay titled Are We a Spacefaring Species? Acknowledging Our Physical Fragility as a First Step to Transcending It, the cultural critic M.G. Lord confronted the human limitations on space travel (such as our ability to withstand radiation of the sort space is just famous for) and along the way name-checked Gattaca:
“Gattaca, a 1997 movie written and directed by Andrew Niccol, describes a space program in which astronauts are chosen based on genetic superiority. In the movie, this is a bad thing. The hero is a love child, not the product of a eugenics exercise. Yet the hero’s short-term triumph—securing a spot on a desirable space mission—may seem less admirable if his weak genes expose him to a fatal illness. In the 1970s, in vitro fertilization procedures were uncommon as well as ethically suspect. Today they are performed frequently and not considered an ethical problem. Likewise, other eugenics procedures that are not approved today may become commonplace in the future.”
Putting aside the bright picture the author sees for the future of eugenics, she is onto the movie’s fatal flaw. Vincent will probably die in space, proving his detractors right. It is fairly obvious in this movie that Niccol saw this story as an allegory about racism, or perhaps any of the unjust forms of discrimination that human beings are famous for. He has said in interviews that science-fiction appeals to him because it can be a vehicle for other ideas. But if that’s the case, Gattaca is an extremely short-sighted movie. Discrimination of the irrational sort has always haunted human experience, and much better artists have shed light on it, particularly in the 20th century. What Lord sees, or at least sees in part, is that in a world with actual, fully-realized genetic engineering, there might be utility in discriminating against someone like Vincent. The notion raises all sorts of questions (ones humanity is going to eventually confront) that do not get asked in Gattaca. Jude Law’s Jerome is a perfect human, and was a competitive swimmer, but became suicidal because he couldn’t do better than a silver medal. This is unrealistic. Somebody is always going to get the silver medal. Athletes confront this dilemma all the time. Someone who may be the best in high school will find in college they weren’t quite as good as they thought. And if they should be lucky enough to outshine most of the competition there, they will find themselves matched at the professional level. Even the greatest of all time are only marginally better than those who were their stiffest competitors. This is the world we live in. What Gattaca could have asked is whether there would even be athletic competition in a world like this. Further, what if Jerome had just not particularly liked swimming? Or sports at all?
Imagine if, instead of the movie we have, the story were about Vincent being genetically designed to withstand radiation, and being prepared his whole life to do so, only to be conflicted because he loves Irene and wants to stay on earth with her. The actual Gattaca is very caught up in notions of superiority and inferiority, as though what a perfect genetic human would be is something that could be universally agreed upon. But this is never true. There are always too many variables, many of them qualitative. Conflicts like the one faced by the Vincent of this imagined, hypothetical Gattaca, because they are predicated on an irrational, immutable human trait (romantic love), are going to be the interesting ones when the technology of the future arrives. That’s because these are the things that have always been interesting about the stories we tell each other, and the quality that defines those we canonize. When millions of students are forced to endure all five acts of Hamlet this year, and some find they actually like it, it won’t be because it provides some insight into the Elizabethan view of Danish court politics, even if their English teacher wants to talk about this. Fiction, even speculative fiction, is interesting because of what it says about people. Hamlet is a play (mainly) about a young person confused over who they are, and what they should do. Young people will forever confront this dilemma, no matter how old Shakespeare gets.
In the modern era, one of the regular metrics applied to any piece of filmed entertainment from the past is whether that thing “holds up.” This is a shorthand to describe whether a thing, which was of quality upon its initial release, is still deserving of that designation at this point in time, when it is watched again. This can be applied to music or books, but most of the time it’s something you hear in conversations about movies or TV. In some ways, finding a thing doesn’t hold up is totally expected, like when you find that while you may have loved The Smurfs as a seven-year-old, it’s not something with which you want to spend your time at 40. But discussions about the ability of something to hold up are much more nuanced than that. Today, people regularly find themselves reacting quite differently to something they watched only a decade ago, while they were in their adult body and brain.
Any discussion about speculative fiction in screen form would be incomplete without a mention of Rod Serling’s The Twilight Zone, the primetime sci-fi/fantasy anthology series that ran for 156 episodes from 1959-64 on CBS. The only thing more remarkable than the fact that this show somehow produced an average of 31.2 episodes a season which were judged to be largely good (often great) is the show’s ability to hold up, even sixty years later. Lest you think that they were widely distributing the workload, 128 of those episodes were written by just three people (Serling, Richard Matheson, or Charles Beaumont). Yet, these stories were (and are still) compelling to watch. What the future might be like was often the subject of that week’s story. Today, many of these still have resonance. The Brain Center at Whipple’s (episode 153) was about an industrialist who begins replacing his workers with advanced machines (not the only episode on the subject), only to suffer the same fate in a chilling final scene. The Trade-Ins (episode 31) featured an elderly couple who seek to exchange their failing bodies for younger models (also not the only episode with this concept), but are trapped in a Gift of the Magi dilemma due to their lack of means. In the years since those episodes first aired, technology has progressed by leaps and bounds in both replacing human workers with automation, and in extending human life. Real people have had to deal with ethical questions about both much like those the characters faced, even if the technology still lags behind Serling’s vision. But with artificial intelligence and bioengineering continuing to advance, it seems very likely the tech will eventually get there, and beyond. Nevertheless, these episodes will always have social meaning, because they are very plainly trying to imagine what actual people would do in this world. When that world exists, people will continue to be people.
They will not magically start making choices that defy reason like Gattaca’s Vincent, or Truman, or (God help them) Viktor Taransky. The Twilight Zone’s black-and-white production values may make it look antiquated, but those aspects of the show have been static for more than 30 years. My first TV (a hand-me-down) was black and white, but I am very much a child of the color age. If I find this show tolerable despite the fact that it appears old, anyone can. People will be able to watch this show for a long time, should they choose to.
And they have been. The show is currently airing its third reimagining on CBS’ All Access streaming service. That endeavor is being shepherded by Jordan Peele, whose recent Us was the highest-grossing movie of 2019 in the United States that wasn’t based on a preexisting property. And that designation is only partially accurate: Peele was inspired to write the movie by a 1960 Serling-written Twilight Zone called Mirror Image (episode 21). In that particularly terrifying half-hour, a woman enduring a long wait in a bus station is suddenly confronted with a doppelgänger who wants to take her place, but (as is often the case) nobody believes her. The Twilight Zone inspiring other writers to create their own stories, which then become extremely successful at the box office, is not without precedent. Many claim it as an influence on their work.
The Truman Show bears many an absurd similarity to a 1989 episode of The Twilight Zone’s first revival entitled Special Service, even having its events set in motion when the main character (“John”, the Truman figure) inadvertently discovers a camera when a wall falls in the opening scene, like Truman with that production light. An irate John is then faced with the dilemma of whether he should quit (escape) his show, or go on pretending as though he had never learned the truth. By contrast, Peele’s Us has so little in common with Mirror Image that it’s hard to picture anyone with a broad enough knowledge of science-fiction to have seen both connecting the two, without being prompted to do so. But that didn’t stop Peele from repeatedly mentioning Mirror Image and Serling while doing press for Us, though this may have been because his deal with CBS was already inked. Whether Niccol ever saw Special Service is unclear. But he and The Truman Show were nevertheless nominated for a Best Original Screenplay Oscar. Even in a possible world where Niccol had sinisterly stolen this story whole cloth from Special Service, it would still be a historical abomination that Truman somehow lost that award to Shakespeare in Love, a film which, 1) achieved its large-scale Oscar bounty through a definitely sinister campaign by aforementioned unconventional horticulturist Harvey Weinstein; and 2) I can promise you, definitely does not hold up.
It’s not by accident or changing taste that holding up has become a standard the culture asks media to meet at the present moment. If I were trying to write this essay 20 years ago about films that were the same relative age as those I’ve mentioned, I’d be woefully unable to do so in such detail. This is because at that time, here-today-gone-tomorrow was the nature of our experience of movies. Back then, you were generally limited to what was available in your local video store. That video store had a mix of VHS tapes made up of new releases, recent releases, and timeless favorites. Shelf space was at a premium, and it would be the unwise video store proprietor who displayed titles that nobody was likely to rent. A title would be released and spend a limited amount of time in the new release section before moving to the appropriate genre section, and then it would disappear forever. Very few movies achieved relative timelessness. These were films like the Star Wars trilogy, the Disney animated canon, and various similar others, but never too many. It just all evaporated within five years or so, as more new and recent releases needed that space. Even if you lived in a city with a lot of specialty video stores (typically just New York or Los Angeles) you would still be limited to the titles the operators of those stores thought appealed to their specialty customers. So, it is possible I would be able to locate The Truman Show in a Blockbuster, even though this is unlikely. Gattaca would not be there, but I could probably find it in a specialty store. Simone? Absolutely not. In 1999, a 20-year-old Simone would be out of print (and assuming it came out in 1979, it would probably never have been in print on VHS). It would be something you only heard stories about, if anyone remembered to tell them. It is not a movie you would even be likely to see on cable.
If I wanted to write anything about it, and I didn’t somehow have access to a film print, I would be doing so purely from memory. And while my opinion of the actual movie hasn’t substantially improved or diminished since 2002, there was much I found I had forgotten when I sat down to rewatch it a week ago. And television? Forget about it. There was simply no practical way to ever see a TV show again. You could buy things like M*A*S*H or Star Trek through mail order at an exorbitant cost (just 1-3 episodes on each tape, and each tape ran about $20), but that was it. The Twilight Zone was available intermittently in this form, with a limited selection of episodes. When I was a kid, my video store had one tape of the series with four episodes on it, and that was all I experienced of it until I was nearly an adult, and my parents got the Sci-Fi Channel around 1996, which was then re-airing it and still does today. The idea of purchasing a “complete season” was unthinkable.
This all began to change, both for movies and TV, with the introduction of DVD, also in 1996. While it was initially a niche product for the high-end market, the economics of the format made it fated to outgrow that designation. DVDs were smaller and simpler, destined to become cheaper to manufacture, ship, and display for sale or rent. Meanwhile, online retailers like Amazon (and Netflix in the mail-order rental market) could offer unlimited shelf space. A host of things nobody had seen in years were suddenly available again, and people were first confronted with the reality that a thing they had once loved was actually terrible. Of course E.T. was still great, and that is why your video store had probably held onto their copy since its initial release on VHS, or repurchased it during one of its rereleases on that format. But on DVD, viewers could realize The Beastmaster (which they may have enjoyed side by side with E.T. in the summer of 1982) wasn’t quite as good as they recalled. The irony is, of course, that DVD now seems like something immediately antiquated. One day, its short reign will be a blip in the history of home media. Streaming video is now the standard way to enjoy these things, and it is likely to remain so for many years into the future, for probably at least as long as movies and TV themselves remain something we amuse ourselves with. The services and the business models may change, but if you are enjoying a movie in 100 years, it will be data that enters your home in an electronic form, rather than being inscribed on a medium like a disc or tape or anything else you obtain elsewhere and put in a machine.
“The future is now! Soon every American home will integrate their television, phone and computer. You’ll be able to visit the Louvre on one channel, or watch female mud wrestling on another. You can do your shopping at home, or play Mortal Kombat with a friend from Vietnam. There’s no end to the possibilities!”
These are the words of Jim Carrey, spoken on a rooftop, during the climax to a film. But Andrew Niccol didn’t write them. They are from 1996’s The Cable Guy, a movie that was not a huge hit with the critics, nor the audiences who saw it. It could better be said to have introduced Jim Carrey as an actor in commercially underperforming movies. It did not catapult first-time screenwriter Lou Holtz Jr. to Niccol-style acclaim. He was a prosecutor before writing The Cable Guy and creating a bidding war among Hollywood studios, and he never wrote again, instead reactivating his California Bar license in 2000 and returning to the Los Angeles District Attorney’s Office. It’s unclear whether the afore-quoted monologue, which appears twice in the movie, was specifically written by Holtz, who has the sole writing credit on the film but ceded the project to producer Judd Apatow and director Ben Stiller during the process. Unlike most, I found this movie immediately hilarious the summer it was released (Stiller’s dual cameo as the Menendez Brothers/OJ Simpson-like twins Stan and Sam Sweet is unforgettable). I venture that this film holds up, as much as a comedy ever can. If you disagree, try streaming Ace Ventura or The Mask today, which were perceived as amongst Carrey’s superior, funnier efforts at the time. They are not funny today. He is irritating in them. In fairness (or unfairness), future generations will probably take no notice whatsoever of any of the films mentioned in this paragraph, and if they do, it will probably be purely for Carrey’s grossly homophobic caricaturing, and they’ll only take such notice so they can unearth his corpse and posthumously cancel him, a pastime which shows no signs of going out of style.
My appreciation for The Cable Guy has not changed, but one thing that has is my respect for the dialogue I quoted, as well as a handful of other things about Jim Carrey’s socially bizarre cable installer Chip Douglas. The monologue itself has gone, in less than three decades, from being something exaggerated for comic effect, to being something inevitable, to being something unremarkable because it’s undeniably true. Spectrum, my local cable company, has been trying to upsell me on their “bundle” of phone, internet, and TV for at least the last decade. I currently pay them only for the internet, but I nevertheless receive all of my TV through it, and while I lack a landline, it is inevitable that before I die (probably much sooner) my internet provider will be the provider of my cell phone service, or vice versa. Carrey’s monologue even seems to predict, in terms of content, the refined highs and crass lows this new age would mark. Latest series entrant Mortal Kombat 11 was released this year, and rest assured, you can play it with a friend in Vietnam. But more than all of that, Carrey’s Chip Douglas is a weird loner who replaces his inability to maintain real relationships with pop culture references. Today, the world has never been so full of these people. This is like three quarters of what’s on Twitter at any given time. On balance, this forgettable movie shines much more light on what has defined the subsequent years than anything Andrew Niccol brought to the screen.
Of course, The Cable Guy wasn’t trying to predict the future. It doesn’t imagine a whole world apart from our own. It simply highlights one corner of the current world we live in, media, which has come to dominate nearly all others, and speaks with prescient clarity about the implications of where things are poised to go. If there are movies that predict the future in any way worth paying attention to, it is ones like these. While The Cable Guy is meant to make you laugh rather than think, and such predicting is inadvertent, there is another sort of movie that highlights head-on what we should notice in our present as being important for the future.
When I contemplate historians examining our age in 100 years, there is one question I have about how they’ll see things, more than any other. I wonder if they will describe 2019 as a year when America was at war. If the answer seems obvious, I promise it is not. While the US has not formally declared a war since 1942, the easiest answer is to say yes and cite the war in Afghanistan, which began weeks after 9/11 and has continued for 18 years, making it an inevitability that an American will soon die there who was not born when the World Trade Center fell. But therein lies the complicated part. 18 years is a lifetime, however short of one it might be. At this point, the war in Afghanistan has become far and away America’s longest war. At what point will it stop being an exceptionally long war, and just become the status quo? It may end, and that end may even come this year. It seems unlikely, but recent developments like the on-again-off-again possibility of an agreement with the Taliban have made an end appear likelier than it has been in a while. But even if it is capped at 18 years, how much will change about American military conflict abroad? The military is engaged, and has been engaged, in combat situations in countries all over the map on an ongoing basis in recent years. These include Afghanistan, Iraq, Syria and Libya. Those conflicts have all spawned a lot of news. But there are countless others that rarely warrant mention. Some of these are reported to be actual war zones where our military functions purely in an advisory capacity, but anyone who saw this movie the first time, when it was called Vietnam, always takes that designation with a grain of salt.
In October of 2017, an ambush killed four US soldiers in the African nation of Niger. The attack by Islamic militants was the deadliest one, up to that point, in Donald Trump’s young presidency, and when he did not make a statement about it for 12 days, criticism in the press and elsewhere began to mount. When a reporter asked about it during a press conference, Trump made a rambling statement that, while probably not designed to escalate the controversy, probably couldn’t have been better designed to do so. Effectively, Trump congratulated himself for communicating with the families of the deceased service members, and took unveiled shots at his predecessors (Barack Obama by name) for not having done so, or not having done so as perfectly as he. What he meant to say specifically is arguable, but the broad point, that he was claiming to be better than Obama on this issue, is pretty plain. The remarks were immediately criticized for being inaccurate and inappropriate. Americans who noticed the controversy seemed to immediately take their political battle stations. If you liked Trump, he was correctly highlighting a difference between himself and Obama, and any claim otherwise meant his words were being taken out of context. If you hated Trump, he was a petty embarrassment to the Presidency, whose untrue words proved he was not qualified to succeed Obama. Voices on both sides of the mainstream politicized the controversy in the following days. However, and this is what is notable, any controversy about the appropriateness of a military presence in Niger was quickly relegated to the fringes. Nobody disputed that it was a tragic loss, but the necessity of that loss was not something Trump or any of the myriad other politicians with a hand in foreign policy were held to account for.
There were questions about how this could be avoided in the future and demands for an investigation by John McCain and others (which did occur), but the idea of avoiding such deaths by avoiding war in Africa was not elevated to a position of any meaning. By any analysis, Trump’s press conference was more controversial than the military action that was its catalyst. That a thing like this would happen, in a country many Americans have not heard of, let alone know American soldiers are serving in, was and is the status quo. Republican Senator Lindsey Graham warned to expect “…more, not less…” American military operations in Africa as the War on Terror begins to “morph”, and said that work begun on the continent by Obama would be expanded. Democratic Senator Richard Blumenthal said of the lost personnel, “I could not look those families in the eye,” but he wasn’t talking about the decision to send their loved ones to sacrifice their lives somewhere of questionable necessity. He was instead chalking the catastrophe up to a shameful failure to provide sufficient intelligence that could “…enable them to be successful in their missions…” By the voices that mattered in both parties, it was taken for granted that those missions were above reproach.
The foregoing being the case, for the question of whether this will one day be regarded as an age of war for this country, does it matter whether the approximately 14,000 US troops presently in Afghanistan are there or not? They represent fewer than a quarter of US troops currently deployed just in the region, and substantially less than a tenth of all US military personnel currently stationed abroad. The fate of the “Afghan War” can be set aside. The real question to ask is whether this scenario, taken as a whole, will be regarded as war. It has existed on a rolling basis since 9/11, and those who take the position that it existed even before that do so in good faith. There is no reason to believe that it will change. It may change eventually, but there’s no path to that change in sight. So it is wholly possible that in 100 years, those historians will be living in a continuation of this status quo. So what will they call it? Like America’s other endless, abstract war, the War on Drugs, the term “War on Terror” will probably eventually be abandoned for policy reasons. President Obama quietly ended the War on Drugs in 2009, but left the prosecution of that war almost the same as he found it. The term was what was objectionable, and I predict the War on Terror will eventually meet a similar fate. But if, absent the name, in 100 years this new status quo or any version of it persists, it will be only that. Ongoing combat operations in a faraway theater will not automatically equate to war, as the term has historically been understood.
It may be a question of scale that allows these wars to continue. It should be noted that the loss of four Americans in a single incident in 2017, while tragic, was happening with regularity when Afghanistan and Iraq were at their heights. At that time, the country was widely perceived (the point wasn’t argued) to be at war, and if there wasn’t quite enough opposition for the tastes of those so opposed, it was a vocal component of the mainstream. President Obama’s early opposition to the Iraq War, a minority position at the time of the 2003 invasion, was in no small way a part of his appeal as a candidate for the Democratic nomination in 2008. That he became President, and that by the time he left office something like Niger was noteworthy enough to raise eyebrows, should signify a major accomplishment for those who consider themselves to be anti-war. If America was still at war, it was undeniably less so.
Perhaps when asking this question in 100 years, historians won’t primarily concern themselves with the conflagrations in the relatively small, unstable countries where these limited firefights are happening. They may instead look at the broader trends, the world powers, and a very untraditional sort of war.
Earlier I said there are other movies like The Cable Guy, ones which point out to the viewer something important to see about the world for the future, but do so earnestly rather than through inadvertent comedy, showcasing change that is happening in the present. If one were looking for films like these which bear on the question of where American war policy is going, the two most prescient films I have been able to find are the only two collaborations between screenwriters Lawrence Lasker and Walter F. Parkes: WarGames (1983) and Sneakers (1992).
In June of 1983, MGM released director John Badham’s WarGames. If you saw it at that time, or for several years thereafter, you may recall WarGames being the first piece of entertainment in which you encountered the concept of computer hacking. The film was ahead of its time enough that the word doesn’t appear in it. While “hacking”, used in this context, can be traced back to at least the 1960s, the word was not in the popular lexicon, particularly not before the 1984 publication of Steven Levy’s history of the pursuit, Hackers: Heroes of the Computer Revolution. In WarGames, we only know the hacker protagonist (Matthew Broderick) as David Lightman, a suburban Seattle high school student who, although smart, is without much interest in his classes. He’s much more engaged with the computer in his bedroom, and the attached device (and here’s what most people hadn’t seen before) that allows the computer to use the phone to call other computers, and communicate with them. There’s no talk of the internet in this movie, even though an audience seeing it today would immediately see it as primarily about the internet. David is primarily concerned with using the procedure to find new games to play, but he engages in a fair amount of seemingly harmless mischief as well. When he learns about a gaming company with exciting new games soon coming to the market, he makes it his mission to dial various numbers where the company is located, hoping he can gain access to its computer and sample the games ahead of time. When he finds an ominous menu of games, he can’t believe his luck, and selects one called “Global Thermonuclear War.”
Of course, it isn’t the gaming company he’s reached, but a highly advanced government computer named Joshua. And the game isn’t merely that, because this computer has access to America’s nuclear arsenal. David tries to abandon what he’s begun, but Joshua can’t. The fate of the world is suddenly what’s in play.
The 1980s were the hottest time in a generation for the Cold War. In the ever-reliable prism of retrospect, this is hard to believe—Russia was not as powerful and eager to fight as they seemed, and neither was Ronald Reagan. It may be hard for people born subsequently to visualize, but if you were alive then, or if you now get your history primarily by using Phil Collins videos as primary sources, a nuclear conflict that would level the world seemed entirely plausible. WarGames is often comedic, but in its presentation of Cold War tension, it’s deadly serious. The film opens by showing us an underground command center where nuclear missiles are controlled, and two low-level Air Force officers suddenly met with orders to activate a launch. What they don’t know is that they’re being tested. But neither does the audience. This is an extremely tense moment in a movie with its share of levity. Ultimately, one of the officers (The West Wing‘s John Spencer) is unable to fire the missile because he can’t be sure. This, the command structure comes to be persuaded, is why as much control as possible needs to be put in the hands of Joshua’s unflinching AI. While arguments in science-fiction about the ethics of AI are older than WarGames, this very practical application of those questions to present-day national security was not something one encountered very often at the summer cineplex in 1983. The Terminator was still more than a year away, and that film doesn’t really ask a lot of ethical questions about the feasibility of AI; it more just asks the audience to take for granted that an AI would be an unflinching killing machine bent on exterminating humanity, without elaborating on why or how. WarGames at least explains the principle of unintended consequences by taking us through the steps of how it might happen. Joshua is not evil. He has none of the menace of pre-Twins or Kindergarten Cop Arnold Schwarzenegger.
If he is going to end the world, it’s going to be by accident, which is probably the only way the world was ever going to end as a result of Cold War tensions.
A lot of people saw this movie. It made nearly $80m in 1983 dollars on a $12m budget. It made a lot of those people think about issues of nuclear threat and where society was going with computers. That’s obvious. What is less obvious, and completely true, is that one of those people was Ronald Reagan. Then-president and former B-movie actor Reagan was fond of watching contemporary movies, and he screened WarGames at Camp David the weekend it was released. Reagan was very unsettled by what he saw in the film. Days later, in a White House meeting, the President asked General John Vessey, the Chairman of the Joint Chiefs of Staff, “Could something like this really happen?” When Reagan began to recount the events of the film, eyes rolled. The President might as well have been raising the national security implications of Tron for the 16 senior members of Congress present. But one week later, General Vessey returned with a startling answer: “Mr President, the problem is much worse than you think.” Hackers were real. The national security establishment was not secured against them. It wasn’t a coincidence. To write WarGames, Lasker and Parkes had done their research. Among the people they interviewed was Willis Ware, who for years headed the computer science department at the RAND Corporation, and who had authored a paper (one of many) way back in 1967 entitled Security and Privacy in Computer Systems, which outlines the dangers of remote access to networks with eerie foresight, examining the military dangers, but also the (increasingly important today) dangers to consumer privacy.
It was probably a good thing Reagan chose WarGames over Psycho II or Octopussy, both out the same week, because his shot in the dark to General Vessey did lead to real policy changes. There was a significant revamp of cybersecurity at the Pentagon, and Congress subsequently passed anti-hacking laws, the hearings for which included the playing of clips from the movie. Nevertheless, none of that stopped the future events contemplated in WarGames from becoming very real.
In February of 1998, the Department of Defense was rocked by a number of intrusions to its networks. The attacks were widespread and appeared to come from overseas, occurring at odd hours. Since President Clinton was preparing potential military action against Iraq at the time because of disputes over UN weapons inspections, and some of the attacks originated in the Middle East, the two things were suspected to be related. The government launched a multi-agency investigation codenamed “Solar Sunrise.” A 24-hour watch was set up to catch the intruders, and when it worked, the government was surprised to find: David Lightman. The intrusions had been committed by two high school students in Northern California and a 19-year-old hacker in Israel. The odd hours of the attacks coincided with school dismissal time on the West Coast.
Nuclear close calls and computer errors were also very real, even if we didn’t know it at the time. On September 26, 1983, while WarGames was still in theaters, a Soviet satellite early-warning system reported the launch of up to six American nuclear missiles. Tensions were high, and the possibility of an American attack real, as the Soviet Union had, three weeks prior, shot down Korean Air Lines flight 007 from New York to Seoul when it drifted into Soviet airspace. Nevertheless, Lieutenant Colonel Stanislav Petrov of the Soviet Air Defense Forces decided the alert was most likely a false alarm and did not report it to his superiors, in defiance of orders, just like John Spencer’s character in WarGames. It was subsequently determined that the Soviet early-warning system had been triggered by a rare alignment of sunlight on high-altitude clouds. The incident was not publicly known until the 1990’s, and Petrov was honored by the UN in 2006 for having possibly averted a nuclear war.
Sneakers, almost a decade later, was less noticed than WarGames. While its overall grosses were in line with those of its predecessor, it was easily outshined at the box office by fare like Batman Returns and Unforgiven when it was released on the then-unremarkable calendar date of September 11, 1992. Though this is a very commercial comic “caper” action movie, it probably suffered from having a leading man who was fast approaching 60 years old at the time. Robert Redford (born 1936) portrays baby-boomer Martin Bishop (né Brice), who has been on the run and living under an assumed identity since narrowly escaping police in 1969 while then a (presumably not 33-year-old) college undergraduate hacking the Republican National Committee and the Federal Reserve. Bishop fled the scene in a Volkswagen van, leaving behind his partner in crime Cosmo to face the authorities, despite the fact that people born in the mid-1930’s didn’t ordinarily drive Volkswagen vans unless their name was Charles Manson. But the discrepancy over Redford’s age (presumably the interest by a star of his calibre in this project was more important) is easy to forget once the movie gets going. In the modern day, Cosmo has died in prison, and still-hiding Bishop commands a crackerjack team of “sneakers”: security professionals with shady backgrounds who are hired by businesses to infiltrate their operations and expose vulnerabilities. This industry exists in the real world, and is a form of white hat (non-malevolent) hacking, often referred to as “penetration testing.” In the film, the team is populated by a stellar cast of supporting players that includes Sidney Poitier, Dan Aykroyd, River Phoenix, and David Strathairn, all with exceptionally fleshed-out personalities for a movie in this normally uninventive genre. Bishop is retained by agents from the National Security Agency to steal a device, the invention of a scientist (Donal Logue), which the NSA explains has been created for the Russians.
If the operation is a success, the NSA will not only pay $175,000 (screenwriters endowed rewards so much more modestly in 1992) but also clear Bishop’s (Brice’s) name in his old case. When Bishop pulls off the caper, and the team returns to the office to celebrate, they discover the device is more than what they were led to believe. It is, in fact, capable of bypassing encryption and allowing access to seemingly any protected data in the world. Soon, Logue’s character has been murdered and it’s revealed their client isn’t the NSA at all, but instead a still-living Cosmo (Sir Ben Kingsley), whose plan was not only to secure the device but to exact revenge.
There are a lot of levels to what’s happening in Sneakers, and how it foretold future events. For one, many people in 1992 were unaware the NSA existed. The agency, which some in government had historically, and only half-comically, called “No Such Agency,” had not (then) been the subject of the myriad successes (and scandals) for which the more operational CIA and FBI were already known. The NSA’s dominion, in the intelligence community, is communications, and communications were much less important in 1992, when most people had never heard of the internet either. That changed with the 1990’s rise of the home computer and later the cell phone, and particularly with the general expansion of the national security state that occurred in the years after 9/11. But if the public failed to notice the prescience of Sneakers when it was released, the same couldn’t be said of the NSA.
“The world isn’t run by weapons anymore, or energy, or money, it’s run by little ones and zeroes, little bits of data. It’s all just electrons… …There’s a war out there, old friend. A world war. And it’s not about who’s got the most bullets. It’s about who controls the information. What we see and hear, how we work, what we think… it’s all about the information!”
These are the words of Ben Kingsley, also on a rooftop, spoken to Robert Redford in the climax to Sneakers. One of the people who went to the theater and saw this movie in 1992 was John Michael McConnell. “Mike” McConnell was no ordinary audience member, or at least not ordinary as a member of the audience for this movie. He was the recently-appointed Director of the National Security Agency. And rather than take offense at Lasker and Parkes’ depiction of, and reference to, his agency in the film, he found it extraordinarily insightful, and Kingsley’s monologue gripping. McConnell advised all of his coworkers to see Sneakers, and he even obtained a copy of the film’s last reel and screened it for the agency’s top officials, telling them that this was the vision of the future they should keep foremost in their minds.
But the effects of WarGames and Sneakers on national policy were mostly still background events. The two American wars that followed 9/11 in Afghanistan and Iraq were more or less traditional, revolving around invasions by troops and live fire. But beneath the surface, a war, perhaps even a world war, much more like the kind Kingsley spoke of was beginning.
Today, the United States is in a major war with at least four other countries. But with one emerging exception, no shots are being fired, and very little is said publicly about the nature and scope of this war. The main belligerents are China, Russia, Iran, and North Korea. There are others, and virtually every nation on the planet has some involvement in this war, but these are the ones who have attacked American targets most directly. If you picture David Lightman when I mention computer hacking, you’re both right and wrong. Many of the participants in this war are not unlike him. However, the attacks they are engaged in are not mischief, and the danger is not of a misunderstanding like the one that character had with Joshua. But the worst-case scenario could one day be on par with the one narrowly averted in the final minutes of WarGames.
In 2008, a senior Chinese diplomat contacted the presidential campaign of Senator John McCain to express his displeasure. The official was miffed about a letter from McCain to newly-elected Taiwanese President Ma Ying-jeou, in which candidate McCain had pledged his support for Taiwan. The call from China left McCain Asia policy adviser Randall Schriver surprised, to say the least, because the letter in question had yet to be sent. “He was putting me on notice that they knew this was going on,” Schriver would later tell NBC News, “It certainly struck me as odd that they would be so well-informed.” While the Chinese would later deny it, the truth was that McCain’s campaign had been hacked by the People’s Republic, as had Senator Barack Obama’s. China’s goal was to exfiltrate massive amounts of data from the campaigns, including policy papers and personal emails. These were just two attacks by China, noteworthy because of their political gravity. China has conducted countless cyberespionage intrusions into American businesses for the purpose of stealing industrial secrets.
The 2008 attack, widely revealed by 2016, should have (but didn’t) put candidate Hillary Clinton’s campaign chair, John Podesta, on notice to be extra vigilant. During that campaign, Podesta fell victim to a spear-phishing attack. Phishing is an attempt to trick a victim into giving up personal information; spear-phishing targets a particular victim. A very ordinary phishing attack involves clicking a link in an email, which brings the user to a site they are tricked into believing is one that they trust. In the Podesta attack, it was his personal Gmail account that was compromised: wrongly believing he was logging into Gmail, Podesta willingly gave the hackers his password. The hackers then stole thousands of emails, which they would use over the final days of the 2016 election in an attempt to humiliate Clinton and aid Donald Trump. After studying the attack, the intelligence community and independent security experts laid the blame at Russia’s door—specifically, the collection of Kremlin hackers commonly called “Fancy Bear.”
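The mechanics of the deception are mundane. The sketch below (in Python, using entirely hypothetical domains; this is not the actual email Podesta received) shows the core trick: a link whose visible text names a site the victim trusts while the underlying href points somewhere else entirely.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collect (visible text, actual href) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

def looks_deceptive(visible, href):
    # Flag links whose visible text names one host while the
    # underlying href resolves to a different host.
    shown = urlparse(visible if "//" in visible else "//" + visible).hostname
    actual = urlparse(href).hostname
    return shown is not None and actual is not None and shown != actual

# Hypothetical phishing email body: the text says "google.com",
# the link goes to an attacker-controlled lookalike domain.
email_body = ('<a href="http://accounts.gmail.example-attacker.net/login">'
              'https://accounts.google.com</a>')
auditor = LinkAuditor()
auditor.feed(email_body)
for visible, href in auditor.links:
    print(visible, "->", href, "| deceptive:", looks_deceptive(visible, href))
```

Real mail clients render only the visible text, which is the entire point: nothing about the message looks wrong until you inspect the href underneath.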
Some attacks have been against private, non-government actors, but nevertheless for political purposes. In February of 2015, a massive cyber attack was launched against computers belonging to the Las Vegas Sands chain of casinos. The company lost access to email and phones, and hard drives were wiped clean at a rapid rate. While customer credit card and other data was stolen, the target of the attack was not selected at random. In October of 2013, Las Vegas Sands founder and CEO Sheldon Adelson had said during an event at Yeshiva University that the United States had to take a hardline position to combat the emerging nuclear threat in Iran, going so far as to advocate the detonation of a nuclear bomb in unoccupied desert as a show of force. Adelson subsequently claimed he had spoken in hyperbole, but his comments did not go unheard in Tehran. The Ayatollah Khamenei responded that, “If Americans are telling the truth that they are serious about negotiation, they should slap these prating people in the mouth and crush their mouths.” The cyber attack, which caused about $40m worth of damage, was traced to Iranian hackers connected to the nation’s Revolutionary Guard Corps. The hack came with messages targeting Adelson left on computers, such as “Damn A, Don’t let your tongue cut your throat.”
In November of 2014, film studio Sony Pictures was hit by the blistering public release of internal, confidential data from a group identifying themselves only as “Guardians of Peace.” The release included not only corporate information like internal emails (many of which were extremely humiliating for the studio) but also personal information about employees and their families. Guardians of Peace had stolen the information through a hack against Sony. If there might have otherwise been any confusion about the identity of Guardians of Peace, it was dispelled by their demand: that Sony halt the release of The Interview, a comedy starring Seth Rogen and James Franco as filmmakers tasked with assassinating North Korean dictator Kim Jong-un, which was slated to hit theaters the following month. The hackers also threatened terrorist attacks on theaters showing the film. This latter threat was effective in forcing theater owners to decline showing The Interview, and Sony had no choice but to pull the release. However, the studio did distribute the movie online, where it quickly became the most rapidly successful film ever distributed that way. Nevertheless, the film was a huge commercial failure for the studio. During the fiasco surrounding the hack and release of data, actor George Clooney and his agent, Bryan Lourd, sought to stand up to the blackmailers by circulating a petition pledging support for Sony amongst the top echelon of the entertainment industry. Clooney has long been politically active, and Hollywood is a community never shy about showing support for important causes. Clooney believed the hack and demand were an attack on free speech, a particularly important issue for artists and media. Hollywood universally disagreed, or at least refused to agree publicly: not a single person contacted was willing to sign the petition.
To Clooney, it wasn’t that they supported North Korea’s actions; they were simply not brave enough to risk becoming Guardians of Peace’s next hacking target. “They know what they themselves have written in their emails, and they’re afraid,” he told Deadline.
These attacks are all shocking and invasive. They are accurately described as acts of global terror. Nonetheless, they were purely attacks on information. Even where there were real world consequences, they were measured in dollars, not lives. This is probably why they were not more newsworthy when they occurred, though all were reported. Comparing them to WarGames’ high stakes of a nuclear war may seem absurd. But hacking by governments (or anyone else) is not limited to information attacks. Much more dangerous, and growing in number, are attacks on industrial control systems. This is the technology that operates things. This is the computer that controls a self-driving car, or a hydroelectric dam, or increasingly, every appliance in the home. A properly executed hack on technology like this could be devastating in lives lost.
As I was writing this essay, Iran and the United States came very near a very hot war. Iran-backed militants attacked the US embassy in Iraq, then a drone strike hastily ordered by President Trump killed Iran’s top general, Qasem Soleimani. Iran responded by firing missiles at targets in Iraq very near US soldiers, but none were hurt. It now appears that hostilities are lessening, and a return to the status quo is the most likely outcome. But listening to the commentary over the week, which often contemplated a World War 3 breaking out, I wondered what the speakers believed that status quo to have been prior to the sudden hostilities. Iran and the United States have never enjoyed peaceful relations in the 40-year modern history of that nation. In a plot that would surely have killed many Americans, Iran planned in 2011 to assassinate the Saudi ambassador to the US with a bomb at a Washington, DC restaurant, only to have their agents apprehended by the FBI prior to carrying out the attack. As recently as June 2019, Iran shot down a US surveillance drone (whose airspace position is disputed between the two nations) with a surface-to-air missile. Iran is accurately said to be the world’s foremost state sponsor of terror, both in Iraq and elsewhere, and the 2015 nuclear deal (for the brief time it lasted) didn’t bring a solution to these problems. However, when I consider the state of war between the two countries, it isn’t these ambiguous misunderstandings or proxy fights that come to mind.
A decade ago, the United States and Israel launched what is widely regarded as a sea-changing attack in cyber warfare. This was the 2010 Stuxnet attack on Iran’s nuclear program. These countries have never claimed responsibility, but nobody doubts it, and nobody suggests an alternative theory with credibility. Stuxnet was an attack on industrial control systems, which exploited then-unknown vulnerabilities in the Windows operating system: what are called “zero-day exploits,” because the software company (here, Microsoft) is unaware of the vulnerability and has thus had zero days to respond to the problem, making an attack almost sure to succeed. Specifically, Stuxnet was a software program that infected computers controlling centrifuges used by Iran to refine uranium capable of being used in nuclear weapons. The computers targeted were not connected to the internet, so Stuxnet was (somewhat brilliantly) engineered to spread via USB drives, from system to system, lying dormant until Stuxnet was sure it was in an Iranian computer controlling this type of machine, then striking. Its attack was subtle enough to make Iran think its centrifuges were merely defective, and continuously replace them. It appeared to be working, until independent cybersecurity researchers discovered the program lying in wait on unrelated computers, and began to deconstruct it and figure out its actual target and objective.
When Stuxnet was revealed, there was very little talk about the action being a violation of Iran’s sovereignty, let alone an act of war against the country. This is perhaps because of the refusal of the attackers to reveal themselves, but also because preventing Iran from achieving a nuclear weapon was a laudable goal to many in the international community. But as a precedent, Stuxnet was a dark one. It cannot be assumed that Stuxnet was the first attack of its kind; its discovery was so improbable that something similar could well have happened before and gone undetected. But it was surely the first widely known. The precedent, therefore, is that attacks between nations on industrial control systems are now in-play. Stuxnet’s specific origin from within the intelligence apparatus of the United States is rumored to be Tailored Access Operations, an elite and secretive unit of the NSA.
Stuxnet was limited in its reach, and didn’t target people, let alone civilians. Attacks that have followed haven’t been so reserved. In December of 2015, systems that controlled Ukraine’s power grid were hacked, and hackers were able to shut down electricity to almost a quarter of a million people. This attack, like hundreds of other cyber attacks against Ukraine and other nearby nations before and since, has been widely attributed by security experts in government and private industry to Russia. The New York Times reported in June of 2019 that, in a move responding to the 2016 Russian attacks on America’s election, the US is currently (secretly) deploying similar weapons inside Russia’s power grid.
These attacks are still a far cry from gaining access to a country’s nuclear arsenal like WarGames supposed, but it isn’t unthinkable either. There are situations where merely cutting the power will result in deaths, possibly even on a large scale. Imagine if you could disable primary and backup power to vital life support systems in a hospital. In the future, should space colonization become a reality, such attacks could foreseeably cut off an entire population’s access to life-saving oxygen. In Star Trek, massive ships fire photon torpedoes. In reality, would they even bother, if an attack on a ship’s networks is easier to execute and has a higher probability of success? One day, cyber war may be the only war.
China, Russia, Iran, and North Korea are all major powers. But relative to one another they can seem much smaller, and probably their power decreases by orders of magnitude as you read through that list. But the cyber war is waged on a very equalizing plane. The players are limited only by their cunning. These countries have all struck the United States, and the United States has doubtless struck back (or struck first). Countless other nations have been on both the giving and receiving ends of cyber attacks, but these four seem most intent on (or most effective at) targeting America. At the present time, half the list has nuclear weapons as advanced as any in the world. North Korea claims to have nuclear weapons and is well on its way to making them deliverable. Iran will almost certainly have nuclear arms in the next 20 years. Even if President Obama’s 2015 nuclear deal had lasted, it would have only forestalled their development for 15 years, and five have already come and gone. If Iran does succeed, it would be only the tenth country on the planet to have the bomb. Make no mistake, the cyber war is a world war amongst the world’s most powerful nations.
If this all seems like a very America-centric view, it is. America hasn’t been the site of actual combat with a foreign nation since World War 2, and even then the fighting was effectively limited to the attack on Pearl Harbor. A person living in a country I haven’t deemed relevant enough to yet mention, who looks skyward to see American drones firing missiles downward, couldn’t be blamed for concluding their nation is in a very real war with the United States. But while that may be the new warlike status quo of American intervention, however sporadic, it isn’t where the real power struggles for international control will take place going forward. Like everything else in our culture, that fight is going to play out online. I don’t know where or when the next 9/11 will happen, but I suspect it will come from a far-away nation, where even now a figure is seated before a screen, patiently plotting, their fingers grazing the keys along with mine.
Lasker and Parkes were not alone in calling attention to the coming information war. Many professionals, the same people who foresaw the importance of the internet writ large, made predictions of this sort. But in the national security context, 9/11 Cassandra Richard A. Clarke warrants mention again. Clarke has spent easily as much time during his prolific career warning of this threat as he has talking about al-Qaeda. In 2000, when Clarke was serving in the Clinton administration, he invited a long-haired hacker named Mudge to a White House summit on web security. Mudge was the lead personality in the Boston hacker think tank The L0pht, and closely identified with the famed hacker collective the Cult of the Dead Cow. At the meeting, the casually-dressed Mudge warned the more formal cabinet members and industry professionals about how thinly protected the nation’s technological infrastructure was. If they, or the President, failed to take Mudge seriously, that wouldn’t always be the case. In the post-9/11 world, Mudge cut his hair, and increasingly used his real name, Peiter Zatko. He is now well-known for the work he subsequently did in developing tools and protocols for the Department of Defense, creating both defensive and offensive cyber security strategies. Thereafter, he went to work at Google.
If Mike McConnell, or anyone else at the NSA, drew inspiration for the direction of the agency from Sneakers, perhaps they were listening too closely. In the film’s final scene, the ‘real’ NSA shows up to collect the powerful decryption device that Bishop and crew have by then stolen back from Cosmo. As if Sneakers needed to add another renowned performer, James Earl Jones appears as a mysterious representative of the agency who wants the technology. Redford, remembering information he received earlier in the movie, points out that the device would be no good to use against the Russians, because “…their codes are completely different than ours.” This has the effect of defusing any speculation on my part as an audience member that the technology of the movie might by some theory be plausible, but that isn’t why the scene is worth noting. The only thing the device would be good for, the Sneakers surmise, “…is spying on Americans.”
In 2013, NSA contractor Edward Snowden revealed what many had suspected since the War on Terror began, but what perhaps no one outside the government had comprehended the far-reaching scope of: that the agency (in cooperation with other intelligence agencies, the US’ Five Eyes partner nations, and several large communications corporations) was engaged in mass surveillance so pervasive as to allegedly touch the lives of every American. While the government’s version of the story and Snowden’s are not a precise match, no one disputes that the tens of thousands of documents he revealed are genuine. Many anti-surveillance crackpots have made startling claims about the surveillance state since the War on Terror began, but far fewer have been indicted for espionage, as Snowden has been, though he currently has asylum in Russia (however enjoyable that is) thanks to Vladimir Putin. That being the case, it is hard to ignore how chilling his descriptions of the NSA’s operational capacity can be. “I, sitting at my desk, [could] wiretap anyone, from you or your accountant, to a federal judge or even the president, if I had a personal email,” he told The Guardian‘s Glenn Greenwald in 2013. Only Sneakers’ imagining that a highly advanced super-device would be required seems far-fetched, in light of Snowden’s disclosures.
That isn’t entirely true, and the NSA’s power over information isn’t absolute. Some things are genuinely encrypted such that prying eyes can’t see them without a key, so far as we know. But one could easily end all of those statements with “yet.” Much of the world’s encryption technology is not based on the creation of a puzzle that is impossible to solve, but rather one that would take a traditional computer a seemingly infinite amount of time to solve. These encryption processes use “trapdoor” mathematical functions. As MIT Technology Review recently put it:
Trapdoor functions are based on the process of multiplication, which is easy to perform in one direction but much harder to do in reverse. For example, it is trivial to multiply two numbers together: 593 times 829 is 491,597. But it is hard to start with the number 491,597 and work out which two prime numbers must be multiplied to produce it.
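The asymmetry the quote describes can be reproduced in a few lines. Here is a minimal sketch in Python, where brute-force trial division stands in for factoring (real attacks on large keys use far more sophisticated algorithms, and real trapdoor schemes involve more than bare multiplication):

```python
def trial_factor(n):
    """Recover a nontrivial factorization of n by brute-force trial
    division; returns None if n is prime."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return None

# The forward direction is trivial arithmetic:
p, q = 593, 829
print(p * q)  # the product from the quoted example

# The reverse direction is a search. At this toy size it finishes
# instantly, but a real RSA modulus runs to over 600 digits, and
# this loop would not finish in any useful lifetime.
print(trial_factor(491_597))
```

The security of the scheme rests entirely on that gap: multiplying is cheap for everyone, while un-multiplying is cheap only for whoever already knows one of the factors.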
This is where Sneakers’ super-device starts to become more plausible. Enter: quantum computing. Quantum computers are highly untraditional, probably still theoretical, or at least prohibitively impracticable, computers of the future that I won’t try to pretend I understand. A quantum computer “thinks” in a fundamentally different way than any computer you’ve ever used. But the idea that you might be carrying a future iPhone with a quantum computer onboard seems totally out of the question; a quantum computer requires enormously precise conditions to function. The point is that these very advanced and powerful computers might eat through today’s encryption the way the “black box” in Sneakers did. If one accepts that, as Sneakers suggested and the NSA clearly believes, the information war is the war of the future, the quantum computer starts to look a lot like the atomic bomb used to. But only time will tell how real a quantum computing revolution might be, and quite possibly that will take more time than I have left to look forward to on the planet.
Lasker and Parkes haven’t written a movie since Sneakers, nearly 30 years ago, and it is the only film other than WarGames either is credited with writing, though both developed the Best Picture Oscar-nominated Awakenings in 1990, and are credited as its producers. While Parkes has produced over 50 films, such as Men in Black and Minority Report, and ran DreamWorks for several years, Lasker doesn’t have a screen credit after Sneakers. Despite their divergent paths, the two appeared on a panel at Google in 2008 to discuss WarGames, on the occasion of its 25th anniversary. In discussing the genesis of the movie, Lasker explained that he was inspired to write a movie about Stephen Hawking, and the possibility that Hawking might one day discover a unified theory of physics, but he might not be able to communicate the breakthrough because of his debilitating ALS. He and Parkes imagined a scientist like Hawking being paired with a juvenile delinquent, and how the two might understand each other. Both characters exist in the final film, though clearly they evolved a great deal from the original concept. The technology that the actual WarGames is known for was something incidental to Lasker and Parkes; a curiosity they ran across as they attempted to create an environment in which to set the characters they were already envisioning. It was a research trip to Stanford that led them to Peter Schwartz, a futurist who elaborated to them what was then emerging in tech, and appeared alongside them on the panel in 2008.
As Parkes put it at Google, to write the other way, to start with the technology as your endpoint and then try to create the characters (which the pair have on other occasions attempted) is a “difficult” and “…often self-defeating process.” Describing the methodology of their research, which led to the technology seen in WarGames, Lasker added, “Let’s find the real story instead of trying to make it up, because it’s usually much more interesting than you can imagine.”
I think that this is the problem with Andrew Niccol’s movies, and what separates them from the work of Lawrence Lasker and Walter F. Parkes. In Gattaca, The Truman Show, and Simone, characters seem to exist to facilitate the story about technology that Andrew Niccol wants to tell. But it’s always fundamentally a story about that technology (more than it is about the characters), and that technology is based on what he can imagine, which, like Lasker said, is never going to be as interesting as the reality. Lasker and Parkes, by contrast, were mainly concerned with the mechanics of their characters and story. As to which technology made it into the script, and how, Parkes said at Google, “All of our research was done in a context—it was the context of where we wanted the story to go, as opposed to saying ‘how does it work?'” This is why characters in WarGames and Sneakers are still vivid decades later. When, in WarGames, David Lightman uses his computer hacking skills to change Jennifer’s (Ally Sheedy) grades, it seems completely real, because it is completely real (now, and when I was a teenager in the 90’s) that a nerdy kid would try to leverage his ability on the internet to impress a girl. When Sneakers’ Martin Bishop embarks on a very risky mission to steal the black box, it’s nevertheless plausible because the reward being offered is a reprieve from the life on the run he’s been living for most of his adult years, and he will soon be (or he already is) an old man. Contrast these very human stories with the unnatural environs of Niccol’s work. In Gattaca, why would a genetically-perfected class need to oppress their unremarkable counterparts in some kind of apartheid state? In The Truman Show, why would a modern-day audience watch a 24-hour-a-day television show that’s tailored to be a facsimile of 1950’s sitcoms? In Simone, why would the entire world fall stupidly in love with a celebrity who is utterly without charm?
These three questions are impossible for Niccol to answer, and because his films lack concrete or interesting characters, and are only about the technology, the effectiveness of those movies rests wholly on those questions. But the technology isn’t interesting as a distraction anymore. The movies, today, wholly fail to hold up.
After Simone, Andrew Niccol was not relieved of his command in Hollywood. He’s kept on making competent movies. He followed Simone with Lord of War, which, just as The Truman Show introduced Jim Carrey to serious movies, served as notice of Nicolas Cage’s departure from them. The film, a mostly unserious effort from 2005 about an arms dealer that nevertheless attempts to be both tragic and topical in its third act, seems especially based on a pre-9/11 worldview. But it was a modest critical and commercial success, which also describes Niccol’s overall career. 2011’s In Time even grossed $174m on a $40m budget, but its story of a future where people only live to age 25, and time is bought and sold, suffers from many of the same criticisms I’ve outlined against Niccol’s other work. Niccol was quickly sued over In Time by the science fiction writer Harlan Ellison, who alleged the idea was purloined from his work. Its main strength was starring Justin Timberlake and Amanda Seyfried, which doesn’t speak highly of Niccol’s craft. Last year he released (via Netflix) Anon, yet another vision of a dystopian future, this one where people have a continuously recording camera in their eyeballs, and a detective (Clive Owen) must stop a killer who has hacked the technology to conceal their identity while murdering. While Anon’s financial viability is known only to Netflix, critics (and users who spoke via RottenTomatoes) didn’t approve, and the film languishes there today with an overall rating of 38%.
I thought Anon was a perfectly competent sci-fi detective story. It wasn’t anything exceptional, although the visuals were very well done. If there was a reason to care about the characters, or what their goals were, it escaped me, but I wasn’t bothered enough to turn off the movie. If history is any predictor, I won’t need to write critically about that fact until some time around 2038. The most glaring flaw the movie had, to me, was that it wasn’t an original idea. The idea had already been executed, better than Niccol could hope to match, in a 2011 episode of the British sci-fi anthology series Black Mirror.
Like the verdicts on its prior two failed rebirths, the emerging verdict on Jordan Peele’s The Twilight Zone seems to be a negative one. Even amongst those who like the series, nobody is suggesting it could hold a candle to Rod Serling’s original. If Serling’s five-season masterpiece has a cultural inheritor on present day TV, it is actually Charlie Brooker’s Black Mirror. Black Mirror was created for England’s Channel 4 in 2011, but moved to Netflix in 2015, where it has resided since. This is why the latter entity’s underwriting of Anon seemed especially odd. Black Mirror, in its first season, aired an episode called The Entire History of You, about characters who had eye implants like those in Anon. That episode, however, doesn’t utilize the tech to tell a clumsy detective story with a predictable conclusion. Instead, it’s a cringe-inducing examination of human jealousy and the sometimes irresistible urge to learn things we are likely better off not knowing. It’s an extremely human story, and one which most adults will find relatable, even if they are much better than the protagonist at suppressing this particularly self-destructive drive. All of Black Mirror’s stories are about technology, in one way or another, but they are equally about people, and what (their creators imagine) people would do if confronted with those theoretical technologies. Again, Netflix’s numbers (with the rare exception they choose to publicize) are their own, but all indications are that Black Mirror has become extremely popular for the genre. It’s been airing new episodes for nearly a decade, and on RottenTomatoes, its average score from audience and critics hangs above 80%.
There’s no utility to technology without human beings. When it comes to a theoretical technology, we are interested in what it means for our actual lives. We want to know what it will make better—diseases we can cure, lifespans we can extend, planets we can travel to, machines which can protect us, etc. Or worse—all of these same things, from another point of view. Any worthwhile examination of technology necessarily revolves around the human element. While sci-fi has sometimes imagined a future earth without people, where the planet is cared for by robots, such technology would actually be pretty pointless. Nature doesn’t really make value judgments as between organisms. If human beings were extinct, and these robots were running around preserving the environment, it would just mean human values were being preserved, rather than some objective natural value. You may think the spotted owl is worth preserving, and the spotted owl probably agrees with you, but I assure you that if you are instead a nearby red-tailed hawk who will die if you don’t eat soon, the calculus is very different. Even if the value judgment is as simple as, “I choose living things over non living things,” there’s no possible earth best-suited to living things. Extremophiles are microorganisms that live in the most extreme conditions. Recently, such life has been found thriving in Pitch Lake, the largest natural lake of liquid asphalt in the world. If the worst climate change apocalypse currently imaginable does come to pass, there are some living things that will be the big winners, at the expense of very complicated things like ourselves. Whether that’s better or worse than the world as we know it now is totally within the (human) eye of the beholder. So technology doesn’t have much application beyond our own lives, even if in our worst nightmares (and in a very possible reality) it is the cause of the end of those lives.
Nobody can predict the future. Especially now, when the future is moving so quickly. The best any filmmaker can do is predict that future by accident. But the timeless qualities of people, be those qualities ugly or beautiful, can be examined in art quite well. This is all that even the best movies can hope to achieve. As a teenager, I went to the cineplex to see a bold vision of the future. 20 years later, I go there to see a bolder vision of myself.
Here in my car
I feel safest of all
I can lock all my doors
It’s the only way to live
–Gary Numan, “Cars”, from the album The Pleasure Principle, 1979
I love cars. They are perhaps my favorite material good. I like driving them. I like putting the radio on. I like putting my window down. I enjoy the power and style of the cars I like, and exploring the world from behind the wheel. I never suspected that a car would be the thing that changed my life forever, with extreme and permanent consequences.
I always felt somewhat this way, growing up playing with Hot Wheels and Matchbox toy cars. But this interest was casual, and one of many. I put away those childish things as I got older, and my interest in cars matured and became a real appreciation when I was a teenager. I came then to like my cars enormous, American, and ancient. Or at least, as a 16-year-old, what seemed ancient to me. Many of the cars I liked were only 10 or 15 years old. However, it just felt like manufacturers were not making vehicles in the late 90’s that appealed to me. Mechanically, the car should be rear-wheel drive, with an eight-cylinder engine and body-on-frame construction. If these cars were slow, they were nevertheless strong—you might get beat off-the-line by many vehicles on the road, but it was still real smooth and easy to pass someone on the highway. I was drawn to a boxy design with too much chrome trim on both the inside and outside. I loved fake wood paneling on the interior trim of a car and sometimes, as in the case of a 1989 Buick LeSabre Estate Wagon I drove in high school, wood on the outside too. I liked Cadillac Fleetwoods, Ford LTDs, Lincoln Town Cars, Oldsmobile Delta 88s, and so many more behemoths that spend their time today existing in junkyards or photographs. Photographs mostly. The junkyards recycled the vast majority of the extant ones years ago.
I tried to learn to work on cars as a teenager, but I lacked the patience. Much of how cars worked and how to properly maintain them seemed beyond my ability to understand. More than a lack of comprehension, my problem was distraction: in high school, I was drinking. I drank as much as I could get, every chance I got. From the first hint of that unparalleled ease that comes for some (like me) with ingesting ethyl alcohol, I wanted to reach out into the world and scoop up every bottle and can it contained and dump them down my throat. Really. If you saw me drunk, I was probably too drunk. However, I was a shy kid and the opportunities to drink could be limited or infrequent. This sometimes drove me crazy at the time, but in retrospect I know my inability to access alcohol kept me safe as a teenager. Despite my overindulgence in booze, I never got in real trouble.
But even if I couldn’t fix a car, I loved driving. From the first time my right foot touched the gas pedal, I wanted more. When I got my license, I drove everywhere I could. Sometimes I would drive all day long. It’s important to remember that in the summer of 1999 the price of gas was on the decline. It fell closer and closer to one dollar, and we waited in tense anticipation of it dipping below. Even with those big gas-guzzling V8s I drove, it was financially manageable to drive around for fun. More than fun, it became a sort of mission as I became an adult; a compulsion to get behind the wheel and travel in a direction whose endpoint is unknown. That mission has now led me to visit 49 US States (you can’t drive to Hawaii—yet).
There was also an endless history of songs by people who loved cars as much as I did. There was the afore-quoted, eponymous “Cars” by Gary Numan. Perhaps more eponymous was the band named The Cars. T. Rex sang about cars frequently. So did George Thorogood. For some it was not love, but lust: Bruce Springsteen urged Wendy to “Just wrap your legs ’round these velvet rims and strap your hands ‘cross my engines.” Songs about cars didn’t even have to be articulate about what they were trying to say. I couldn’t tell you what “Baby You Can Drive My Car” means, but I liked it.
Here in my car
I can only receive
I can listen to you
It keeps me stable for days
But with adulthood, other things were becoming a compulsion too. Namely that old, unstable compound that could explode at any time—my drinking. After I graduated high school, I graduated to a more substantial way of using alcohol. I socialized closely with a lot of people over 21, so I would have all the access I needed. Once I was able, I began to drink every day. For most people, drinking is usually an evening adventure, but at age 19 I found myself drinking a six-pack of beer in the morning just to get the day going. Sometimes I would wake up in the middle of an ordinary night and find myself needing to put alcohol in my body, a need I gave in to. Advanced drinking began to create advanced problems. There were blackouts and hurt feelings, and episodes of drunkenness marked by irritability or maudlin sadness. By the time I was 20, I found myself without my family or many friends speaking to me. I was unemployed. I was uneducated. Overall, I was unable.
Once the pain of how I was living got to be bad enough, and without knowing where to turn, I dragged myself to Alcoholics Anonymous. I found the program and people to be very helpful for me, but I was generally unwilling to take their suggestions about how to stay sober, and I was soon back at the bottom of the bottle.
In a few years’ time, those bad consequences of my drinking only grew. I had entered college and maintained my grades, but my personal life and mental stability were spiraling down the drain in a stream of high-proof liquor. At 24, I entered rehab for the first time.
It took about a year, but at 25 I finally gave in and took the simple advice AA had always offered. And just like they told me—it worked. I was sober for six months. Then a year. Then another. My life improved, and greatly. I graduated from a good university Magna Cum Laude, and had a good job that I cared about. This, for a guy who a few years earlier was a chronic dropout and no-show. Life wasn’t perfect, but seemed to improve so much with each passing year that it didn’t strike me as crazy to aspire to what I thought of as a perfect life.
And I still loved cars. In my drinking, I had lost interest in the hobby, and now it came back. I bought and sold several of those cars I liked. Even though I was never going to be a mechanic, I learned to do a number of important things on a car. I changed my own oil. I did my brakes. I learned to replace (the more simple) broken parts with those from the junkyard. The biggest help was YouTube, which had thousands of hours of people working on their cars, and explaining step-by-step how it was done. I was happy with the progress I was making with cars, but also with myself as a person, and that intangible asset is truly one of the most valuable feelings a person like me can have.
I came to feel safe in the belief that it would last forever. This is never what a person should think. Some things do last for what we perceive as forever—the course of our own lives—but nothing in life is guaranteed and anything can happen at any time. Over the years I encountered a number of defeats that chipped away at my belief in myself. There was a heartbreak. Then a career that didn’t work out. Then a creative project that fell apart. I began to see myself differently; I pictured myself as someone chronically unable to achieve my goals. At this time, I also began to drift further and further from AA and the things I had learned there.
On July 20, 2014, after six continuous years of sobriety, I drank. This story is, if nothing else, a cliché. Millions of people with an addiction have, over the course of history, falsely told themselves that this time will be different. Only it never is. We alcoholics who achieve sobriety only to find ourselves with a drink back in our hands are like those career criminals in the movies who believe they can pull one more job, this one to retire on, only to be met with tragic consequences. So naturally, this is what happened to me. The mess I had been in my early twenties announced its return, and this time it was stronger.
So there were more rehabs. Each time I wanted desperately to be sober, but I couldn’t pull it off for more than a few months. Ultimately, it was a recurrence of an earlier problem: I was unwilling to take advice and do what I knew I had to. So I didn’t get better.
By the Spring of 2017 I found myself in a sober living home in LA. I began to do AA earnestly again and at 30 days sober I was feeling very optimistic. What I didn’t know was that my life was about to dramatically change forever. At 2am on May 18, 2017 I was awakened by my roommate. He explained that a girl from work had driven him home, only to have her car battery die. I was exhausted. I did not want to get up. But I know how it feels to be stranded, and if I could help another person (stranger or no) avoid that feeling I was going to. So I jumped her battery and we wished her luck. But minutes later she called. The battery had died again a few miles away. So we got into the 2010 Ford Mustang GT I was driving at the time and traveled to the scene.
Now that GT was a beautiful car. It wasn’t my usual style, but there was so much I enjoyed about it. The Mustang was powerful and nimble, and I thought it had a real sharp look. More than that, I was proud of the great deal I had negotiated when I bought it used. A lot of my vanity was wrapped up in that car. This would create a powerful irony with what came next.
We found the girl’s car, a Chevy Trailblazer, on a quiet road. It was actually a school zone. There wasn’t a soul in any direction. Cars were not passing. It was truly the dead of night. We began to connect the jumper cables.
But we were interrupted by the greatest force I have ever felt. It was a sudden pushing of my whole body at a speed I couldn’t comprehend. There were sparks, but that was all I remember seeing. In an instant, I was on the ground, on my back. I could hardly move any part of myself, but I knew with lightning speed that the problem was in my right leg. My driving leg. What I felt in my heart at the time wasn’t important, but it would be later. In retrospect, my back was the best place I could be. A bone from my lower leg was actually poking out through my calf, and I am grateful to have been spared setting my own eyes on it. What’s more, I was bleeding. A lot. I didn’t know it in the shock of what had just transpired, but time was of the essence. People bleed out in situations like mine all the time.
My roommate took the same impact I did, only on his other leg (because he was facing the other direction). I couldn’t see or hear him. After telling me she was calling 911, the girl ran to find him. I couldn’t see her, but the blood-curdling scream she let out was inescapable. It felt like being on the wrong side of the screen in a horror movie. The noise felt like it shook the world, after the world had just been badly shaken. In that moment, I was sure my roommate was dead.
Then the sudden tragedy became, as suddenly, surreal. There was a man standing over me shouting swears down at me. Shouting swears in every direction, actually. Having had experience, I could tell from the slur in his voice that he was drunk. Perhaps very drunk. While I didn’t know what happened, it seemed obvious that this gentleman had been the cause. Later it would emerge that this guy had plowed into the Trailblazer’s rear at what the police said was 70-90 miles per hour, crushing my roommate and me between its bumper and that of my car with incredible force. As is often the case in an accident like mine, the drunk man had stepped out of his totaled, brand new Audi without a scratch. Later, I learned that when a crowd had gathered, the man punched my car and attempted to flee. I was lucky in that the good people of that quiet neighborhood barred his exit. Today he is in a California prison. Almost from those first moments, the irony of a drunk like me getting hit by a drunk driver was not lost on me.
Fortunately, the Los Angeles Fire Department is among the best in the world. They arrived quickly and got me to the hospital in just eight minutes. But in the ambulance was when I first began to feel a pain like none I had hitherto experienced. It was agony. A paramedic quickly gave me IV morphine and asked if it was helping. I screamed that it wasn’t. So he gave me more. Nevertheless, the pain didn’t budge an inch. Once I was at the hospital, they tried to ease it again, this time with the famously dangerous fentanyl; a drug with such a nefarious reputation these days that many don’t know it isn’t exclusively a street drug. Even with this heavy-duty solution, it was as though the nerves in my leg were ringing invincibly. There was no relief. There wouldn’t be for a while.
I was quickly informed that I needed to be rushed into surgery. I didn’t understand how bad the damage was, I just knew it was serious. Soon they were putting me to sleep, and I felt such a relief that anesthesia would at least spare me the pain for the time being.
I woke up about 16 hours later in the intensive care unit, still in great pain. A futuristic-looking device held my right leg firmly in place, with four evenly spaced posts drilled into my leg, all the way down to the bones. Slowly, my friends began to show up and explain what had happened. As it turned out, thank God, my roommate was alive. He had simply been knocked unconscious and been badly cut. This, in addition to a severe injury to his left leg similar to mine. I think when his coworker had seen him, motionless and bloody, she must have thought the worst and let out that horrified cry.
When I finally spoke to my doctors, they said the damage had been very bad but I would recover. In the night they had needed to transplant a large vein from my left thigh into my lower right leg, and give me multiple blood transfusions, but I had survived the worst and the prognosis was good. They said there would, however, be more surgery.
Then, a revelation. Why hadn’t the morphine or the fentanyl helped? It turns out I knew. A few weeks earlier I had begun taking a drug called Naltrexone. Naltrexone is an opiate antagonist. If an opiate addict tries to use opiates while also taking Naltrexone, they will find they cannot get high. I was never an opiate addict, but research implies that Naltrexone can also help alcoholics with cravings, and I found this to be true. But just a day before the accident, I had been in a conversation with another one of my roommates about how the danger of Naltrexone is that one won’t be able to be treated for acute pain if they are in a sudden, terrible accident. It seemed true to me in that conversation, but nevertheless far away, like a very real thing that could only happen to someone else. If only we had known how prophetic our speculation would be. But in the chaos of the accident, I hadn’t been able to summon this memory and put the facts together. Those first few pained days in the hospital were hard, but the Naltrexone began to wear off and after about a week I was feeling real relief.
I have no doubt my doctors gave me this news in good faith, and that they had every reason from what they were seeing to believe in its truth, but even the most qualified doctors can be proven wrong. About a week later I went under for surgery again, and awoke to another shock: much of the tissue in my lower leg had necrotized (died) and the leg would have to be removed just above my knee.
Here in my car
Where the image breaks down
Will you visit me please
If I open my door
I don’t know how I dealt with that news. Maybe the pain meds were working too well. Maybe it was too shocking to fully absorb. My surgeon told me it was my choice, and that I could try to save the leg, but there would be no point. I would walk much easier and have a much more functional life with a prosthetic leg. He had seen patients back on their feet (so to speak) in 90 days. Feeling like I had no other choice, I believed him, and to this day I believe him. I gritted the teeth of my will and told him, “Yes. Do it.”
So they amputated my lower right leg. I went under with my whole body, and woke up with a part I had used my whole life gone. The doctor told me there would be pain, and I thought because of my ability to handle the pain from the accident, I would be okay. I was wrong. From almost the moment I woke up I writhed in my bed with unceasing sensations of burning, painful tingling, and crushing. Luckily that writhing was mitigated by the love of my family, and my parents were present, giving me their hands to hold tight when I just couldn’t escape this awful, previously unknown level of suffering.
What I would soon learn was that I was experiencing the beginnings of the phantom pain I live with to this day. Phantom pain, if you’ve heard of it, is every bit as bizarre as they say it is. It is truly the very real sensation that a missing body part is still there and suffering terrible pain. At that time, I could still feel every bit of what was gone. These days, most of the feeling caused by incorrect signals to my brain indicates my heel, toes, and portions of my foot are still there.
There are the things that one expects when facing an ordeal like mine, but then there are the things that surprise. For me, it was an acute feeling of betrayal. I learned, from a friend, that some friends and acquaintances of mine had come to a consensus about the accident, and they had some nasty things to say. “They should have waited for AAA.” “They brought this on themselves.” I don’t know if that’s true. I guess you maybe bring anything on yourself. But hindsight is—an easy tool for use in judging other people. Kennedy should’ve worn his protective bubble while riding in that convertible, but his strength was in the way he connected with people, and he chose not to. Was it his fault? I say no. The irritability I felt, or perhaps the irony of the whole thing, was compounded by the fact that I knew these people from recovery. Some of them had used heroin and crack for years. Even recent years. I don’t judge anyone for this, but I fail to understand how one who thinks it a reasonable idea to put a dirty needle in their arm can say that it was a bad idea for me to attempt to help a stranded motorist in the night. One of these people had even hit and killed another person with his car while driving drunk, only to avoid prosecution due to a technicality. Maybe seeing it as my fault was the only way to see it, because to blame the drunk who hit me would make him face something about his own past he was unable to. Maybe a lot of things—it’s not worth speculating. But it stung, and while the wound to my feelings has healed, there remains a scar.
I was in one hospital or another for over three months. In Los Angeles, the quality of hospitals varies. I started out at Ronald Reagan UCLA, where the care was excellent. Thereafter I went to a nursing home in the valley. But there, in a place that never seemed very clean to me, I caught a bad infection. For days I complained that my pain was increasing, but the staff failed to see the infection, despite their dressing of my wounds every day. When a doctor finally saw how infected my incision had become, her response was simply, “Whoa. That’s infected.” The infection had grown so much in the nursing home that, as would soon emerge, it had reached my bone, and I had osteomyelitis.
I ended up in Valley Presbyterian Hospital. This is a hospital I would not send my cat to. Everything I observed implied to me that Valley Pres was a filthy, cramped, old, profiteering dump. I was initially taken to West Hills Hospital, but I had to be moved when Anthem Blue Cross refused to pay for it. Anthem was happy to do business with Valley Pres, apparently. I think health insurers should have to visit and accredit those institutions with which they have agreements. And when something goes wrong, the insurer should be liable to the patient just like the hospital is. I’m happy to tell you I don’t have Anthem now. I advise you to avoid it too, if you can. These are just my opinions about Valley Pres and Anthem, but I can tell you that I haven’t had trouble finding others who share them.
While my surgeon had seen people fully recover in 90 days, this wasn’t the case for me. Over that first year there were nine surgeries. There were frequent infections. Because of these infections and the recovery time for surgeries, I couldn’t heal completely, and I couldn’t be fitted for a prosthetic leg until my incision had fully closed. Finally, when it had been 11 months since the accident and I felt things were never going to change, the incision healed. By the end of April 2018, I was walking on a robotic-looking titanium prosthetic. It was, in a word, liberating.
Of course, life was permanently different. A few months after I got out of the hospital I was riding in a car and we passed a trampoline park. I had the realization that I would never be able to go to a trampoline park. However, this was supplanted by another realization: as a person with two legs, I had never elected to visit such a place. In fact, I had chosen not to work on a particular day in 2012 so that I could avoid a work outing to a trampoline park.
There were of course things I had done that I would miss. After I had been on the prosthetic a few months, having learned how to use a car’s pedals with my left foot (very weird at first but now seems natural), I hit a bump and got a flat tire. In the old days, a flat tire was perhaps the easiest thing for me to fix on a car. Attaching a spare is a simple operation. Now with one leg, I found it nearly impossible. It’s very difficult because, having no ankle or knee, I cannot crouch or kneel. The hinge that functions as my fake knee doesn’t align exactly with my remaining knee, so if I try to kneel I tend to lose my balance and tip over comically. Have you ever stood upright and touched your toes without bending your knee or ankle? This is how I function. If I drop something and need to pick it up, or my shoelace is untied, this is how I must remedy the situation. That probably sounds more annoying than it actually has been. In reality, it doesn’t take long to get used to a change like that. But it severely affects my ability to maintain a car. I’m quite certain I’ll never change my own oil again because I can’t imagine how I would get beneath the car. I have enough trouble getting in and out of a car if I can’t swing my door all the way wide open.
And would you believe, this is all okay? I have a mechanic I trust and he does much better work than I ever could. It’s certainly more expensive, but unexpected expenses—even ongoing ones—are a part of all our lives. No, I won’t go trampolining anytime soon. I won’t be skiing. I won’t be skydiving. I won’t even be able to pick up speed to hurry when the light in a crosswalk is starting to count down. Regardless, lacking all of these things is perfectly fine. I was never passionate about trying any of them. To the extent my life is different now, I’m just used to it. I don’t remember what it was like to be able to wake up and spring out of bed in the morning. I’m just used to reaching for the apparatuses that comprise the prosthetic and attaching them when I awake. Pretty much everything else that’s different is the same way. Sure, when I am in a situation like the aforementioned crosswalk, I suddenly feel stress and am reminded that I’m not like everyone else. But that sensation is fleeting. Most of the time I’m able to let it go and resume my day with an emphasis on what is positive in my life now, and there’s much to be grateful for. I’m about to enter my second year of law school. I have a family who loves me very much. In all likelihood, I’m going to live a long time. I’m even working with a new pain doctor who might be able to take away the phantom pain once and for all, albeit with another surgery. And once again, I am sober and happy with the progress I am making in life. My needs, largely speaking, are met. I could ask for more and sometimes I do, but much (not all) of the time I’m simply content with things the way they are. I think I’ve got it pretty good.
What did I feel when I found myself on my back in that dark street, feeling my life was in danger, but closer to death than I knew? I felt the most extreme and overwhelming desire to live. I wanted to continue, whatever that meant. Muttering in the moments before anyone appeared, I begged God to save my life. I told him I would do anything it took. I told him I would live with whatever it meant. I just knew then, and believe now, that at the core of me the strongest drive is merely a will to survive. This should not be news, as survival appears to be the primary mission of all life on earth, but it’s quite a different thing for this drive to present itself to you at full volume and refusing to take no for an answer. When I do struggle with negative feelings, I try to remind myself of this simple desire, and be grateful that my prayers have been answered by somebody or something, or just fate. I don’t question it.
And what about cars? A few months ago, I had a taillight housing crack and break. Somebody must have backed into me, but I wasn’t there to see it. I’m happy to report I changed that taillight housing myself and felt proud and satisfied afterward. It was an easy replacement that didn’t involve kneeling or crouching, just a lot of unscrewing/screwing while leaning into the trunk. For me though, it was progress. It was something I could not have done a year prior. I don’t know what the future holds for me, but I now believe there’s no reason to think it doesn’t involve more progress and good things. It’s not worth believing anything else.
Here in my car
I know I’ve started to think
About leaving tonight
Although nothing seems right
This year, I also became the proud owner of (what I consider) a real flashy 1992 Buick Roadmaster Limited. I love the car and I drive it whenever I can. It’s old and it’s surely traveled some bumpy roads over the years, but like me, this car is a survivor. If you’re alive today to be reading this, so are you.