In an eight-month period between October 1997 and June 1998, a hitherto obscure filmmaker emerged to release two mainstream movies about technology and the future. His name was Andrew Niccol, and if it wasn’t yet a household one, his sudden success and unique talent seemed to imply it would be on the lips of the culture for years to come.

Gattaca, which Niccol wrote and directed, was released in October to critical votes of confidence and a disappointing box office, albeit one that couldn’t have surprised many keen observers of show business. In that era, star power was essential to opening a traditional studio movie. Ethan Hawke and Uma Thurman had by that time earned the status of movie stars, but they were not the type of stars whose names could automatically fill seats in theaters. There was success to be had for movies without big stars in the mid-90’s, in what was (often inaccurately) termed “indie” film, but the main determinant of that success was an arthouse credibility unavailable to a $36 million, mid-budget science fiction story like Gattaca. In practical terms, art credibility in widely-released films was primarily ordained via purchase by Harvey Weinstein. This is why “indie” movies like Pulp Fiction and Good Will Hunting were granted underdog nobility despite featuring the stars of Saturday Night Fever and Mork & Mindy. Gattaca was simply never going to fit this mold.

If Gattaca earning back a mere third of its budget was a defeat for Niccol, it couldn’t have hurt too badly. The release of his next movie was already imminent, and it had more than enough firepower to succeed on all fronts. The Truman Show, which Niccol wrote and Peter Weir (Dead Poets Society) directed, came out in June to immediate praise both in the voices of critics and in the handing over of stacks of money by the moviegoing public. This was the film that introduced Jim Carrey as an actor in serious movies. When it came time for award nominations, the film proved a contender there too, coming within striking distance of three Oscars and winning a handful of smaller awards.

Both Gattaca and The Truman Show take place in dystopias created by modern technology. Gattaca is about a future where genetically engineered human beings are a reality, and a class structure exists which oppresses people who were conceived naturally, branding them incapable and inferior. The Truman Show is about a world wherein a man in his 30’s (legally adopted by a corporation at birth) is unknowingly trapped in a dome so gigantic as to be visible from space, his every waking (and sleeping) hour broadcast to the outside world through thousands of hidden cameras, and all of the people in his life secretly actors. Both of these movies take issues that were on the minds of people in the 90’s—the immediate cracking of the human genome, and the ubiquity of media and camera surveillance—and creatively imagine what terrifying futures lie ahead if they are carried to their logical endpoints. As a teenager, I was blown away by both films, and more excited about Niccol than any of the other newly-minted filmic geniuses of that period. What’s more, I (and many others) were convinced that these stories were prescient, and were teaching important lessons about the future we would all soon have to contend with. Two decades later, it is evident I was wrong on all counts. Movies don’t predict the future, and neither can I.

How able are we ever to see what the future holds? How do we know the things that will matter tomorrow are the ones we are choosing to concern ourselves with today? Try this experiment. Pick a day in recent history. One at least ten years in the past. Try your birthday. Then go to a library (or the internet) and find the front page of a newspaper from that date. Look at the stories that appear there, then ask yourself two questions. 1) What were the major events in history that transpired between then and now? and 2) What evidence is there on the page that anyone was contemplating those events in a meaningful way, before they transpired? The answer to the first question will vary depending on which date you choose, but the answer to the second question will almost invariably be that the front pages of newspapers do not forewarn of what will be written in the books of history.

I was born October 29, 1982. My adult life has been defined by two historical events. The first is the attacks of September 11, 2001, and the second is the 2008 Financial Crisis. I was 18 years old on 9/11, and approximately 25 when the economy fell apart. When I recently examined a New York Times from Friday, October 29, 1999 (20 years ago), I was surprised by the utter lack of reporting about the issues that would underpin world-changing events within a decade. Just kidding. It met my expectations completely, because it was a totally contrived exercise meant to allow me to write the preceding paragraph. But consider the point anyway. That day, the Times’ front page was shared by several stories then germane to our national discourse: assisted suicide in Oregon; the prospect of across-the-board cuts to the federal budget; the meeting of five Republican presidential hopefuls not named George Bush. The sole photograph above the fold was actually of the Falun Gong religious sect and its plight under the Chinese government. None of these things are in the news today. But with the exception of Bush’s inevitability as the nominee, they didn’t really come to any conclusion either. They simply petered out in the national conversation, displaced by things of greater interest. If there’s a point to this exercise, it is that there is very little predictive value in what we consider important enough to talk about in the present.

While terrorism and the economy were written about (the former sometimes, the latter often) in the New York Times (and the media as a whole) over the years leading up to 2001 and 2008, there was a decided shortfall of handwringing about the potential danger of the two aforementioned calamities occurring, right up until the point at which their impact was undeniable. That isn’t to say that nobody predicted these things. In the wake of both were born celebrities whose exclusive claim to relevance was such prognosticating. For 9/11, it was former national security official Richard A. Clarke, who went from being known (if at all) before the tragedy as someone difficult to work with, to today being the author of eleven books (which include four novels, as well as 2017’s earnestly titled Warnings: Finding Cassandras to Stop Catastrophes). For the Financial Crisis, it was chiefly the investors profiled in Michael Lewis’ The Big Short, who are probably more famous because of Adam McKay’s star-studded film version. In both instances, these were people who held firm to an unpopular view at their peril. In retrospect, they seem like nothing short of reluctant geniuses for their foresight.

In the aftermath of Gattaca and The Truman Show, I expected Andrew Niccol would eventually be adjudged a genius on par with those predictors. It might take a long time. It might take my whole lifetime. But it was going to happen. Over the years, those narratives would come to seem increasingly prescient, and these films would come to be regarded as something like what George Orwell’s literature was in my youth: a warning about the future which, while never coming literally true, is seen more and more as having predicted the issues the world would contend with many decades after its creation.

In the short term, the evidence began to mount. At a celebratory White House press conference characteristic of the leaps and bounds that all of technology seemed to be taking at the time, President Clinton announced the completion of a working draft of the human genome in June of 2000. In his remarks, the President compared the mapping of the human genome to the “frontier of the continent and of the imagination” that Thomas Jefferson had set out on a map in the same room two centuries earlier, in the wake of the Louisiana Purchase. “It is now conceivable,” said Clinton, “that our children’s children will know the term ‘cancer’ only as a constellation of stars.”

In Gattaca’s near future, Hawke’s Vincent Freeman is especially occupied with knowing the constellations of stars. His dream is to go to space as a celestial navigator for the Gattaca Corporation, an entity something like SpaceX, if it had the corporate culture of IBM in the 1950’s. But Vincent’s experience in the genetically enlightened future is not what President Clinton was picturing. Vincent was born a “god child,” as Thurman’s Irene Cassini phrases it in a revealing bit of pejorative slang. He was not genetically engineered, and therefore imperfect, so this ideal job opportunity is closed to him. In fact, when his genome is mapped (at birth), it is determined that Vincent has a high probability of several genetic disorders and a projected lifespan of only 30.2 years. But the force of his will is not to be underestimated. With a little luck, a lot of work, and one very intense sacrifice, Vincent is able to purchase and assume the identity of Jerome Morrow (Jude Law), a perfect human specimen whose competitive swimming career was cut short when he was paralyzed in a car accident, and who has now withdrawn into alcoholic self-annihilation. We join the narrative with Vincent already established in his double life, and the tension of the movie flows from whether he can keep up the ruse long enough to board the ship on his launch date and venture into space, or whether he will be caught by detectives who are investigating a murder which has taken place at Gattaca, and who are suddenly scrutinizing its workforce to determine who might be an “in-valid” (a person masquerading as genetically superior, the way Vincent is).

I didn’t think my children’s children would one day live in Gattaca’s oppressive dystopia, literally. But I knew they sure weren’t going to wake up in Clinton’s rosy future—where suffering is but a memory—either. This new frontier was, in all likelihood, going to be more similar to President Jefferson’s than President Clinton had meant to imply with his analogy. For someone, the sudden arrival of the genetic future was going to feel the way the sudden arrival of the United States of America felt to the native peoples who had inhabited the West since before recorded history. That is to say, the future would be something substantially less than the complete success which the Human Genome Project formally declared itself in 2003.

Elsewhere, The Truman Show yielded changes even more immediate. Within a decade of its release, the biggest thing happening on television was programming about the actual lives of real people. “Reality TV” had existed prior to The Truman Show in embryonic form, as FOX followed the men and women of law enforcement (and the men and women of humiliating contacts therewith) on Cops, and MTV was figuring out what happens when seven strangers are picked to live in a house together and have their lives taped on The Real World, which was said to be where people “stop being polite… and start getting real.” While these shows were successful during the 90’s, they were nothing compared to what was to follow. Reality TV, in its fully realized iteration, began with game shows like Survivor (which was an immediate hit in 2000 for CBS and remains strong today), which encouraged human beings to behave at their worst and betray each other for their own gain. While televised competition had existed since the inception of the medium, the history of game shows over the decades, from Twenty-One and Name That Tune through Wheel of Fortune and Supermarket Sweep, had failed to weaponize the darkest impulses of fame-and-money-hungry participants the way Reality TV’s game shows (like the aforementioned Survivor, Temptation Island, The Amazing Race, and so many others) began to in the early 2000’s. It helped, of course, if the worst people were selected for the mission, and then placed in contrived pressure situations, like animals forced into combat to perversely amuse spectators. Reality TV’s producers (themselves probably not our civilization’s best exemplars of ethical virtue) dramatically innovated in all of these dark endeavors. It was an entirely logical progression when, from this sinister ether, Donald Trump’s The Apprentice oozed into being in 2004.

Soon it became increasingly clear that the participants needn’t even be competing over anything for their worst traits to be exploited. From MTV’s hit Laguna Beach: The Real Orange County (2004) to Keeping Up with the Kardashians (2007), a success probably more monumental than any other in the history of cable television, the ordinary doings of people (even if those people were not ordinary, but instead very rich and subsequently very famous) were what defined Reality TV as the years marched on. The idea, monetized to ever-increasing dollar values, was that you could point a camera at someone, start rolling, and capture not a game show, or a scripted drama, or a sporting event, or a news broadcast, but simply the way they had chosen to spend their time, and it would be compelling for people at home to watch. This absolutely seemed to validate the thesis at the heart of The Truman Show.

Carrey’s Truman Burbank, of course, is much more mundane than all of that. He has lived his entire life on Seahaven Island, a chokingly pleasant seaside community where Truman sells insurance and lives with his wife, Meryl (Laura Linney). In his spare time, he enjoys driving golf balls with his best friend, Marlon (Noah Emmerich). Jim Carrey plays Truman in the same tones in which Peter Weir captures most of the rest of the film: the happy-go-lucky stereotype of 1950’s and 60’s television. The whole thing looks like an episode of The Donna Reed Show or My Three Sons, but in bright colors. Where drama and conflict exist in this world, they don’t venture much beyond what characters on The Many Loves of Dobie Gillis experienced. All of the other characters (like Meryl and Marlon) are paid actors, with Truman being the only person having an authentic experience on his show, albeit one wherein he is blinded to reality as though chained in Plato’s cave. That all begins to change in the film’s opening moments, when a production light falls to earth from the ceiling of the enormous dome that actually houses Seahaven Island and shatters, like Truman’s idea of his reality will soon begin to.

Niccol originally conceived of The Truman Show as something much darker. The 50’s sitcom aesthetic in the eventual product is the influence of director Weir, who had Niccol write sixteen drafts before he considered it ready for celluloid. One of those older drafts has been (supposedly, there’s no verifying the document’s authenticity) available online for years, and it is much more like Gattaca than what audiences eventually saw in the summer of 1998. While many of the events are the same, they are tonally different. In the movie, Jim Carrey’s Truman dreams of leaving Seahaven and reuniting with the girl who got away (was fired). The original Truman had similar feelings about his environment, the more urban “Utopia, Queens.” But only the latter reacts to his dissatisfaction with life by secretly consuming Jack Daniel’s in his car in the morning. That original draft has much more the feeling of being a thriller about escape from a world that is a sort of prison for its protagonist, like Gattaca. Other traces of the original, dark Truman remain as well, sometimes in surprising ways: when Truman has begun questioning his reality and begins to be overtaken with paranoia, Meryl attempts to defuse him. In a brief moment, perhaps the film’s most dramatic beat, he yells aggressively at her. While it is as caged in comedy as much of the rest of the film, Truman violently grabs Meryl, and in that moment she is afraid for her life, which is exactly how Linney plays the interaction. If The Truman Show were made as Weir’s comedy in 2019, I do not believe this moment would make the final cut. The escape element survives into Weir’s film, dominating the third act, but the director has to painfully remind us of how unserious this business is by concluding the movie with Carrey repeating his saccharine catchphrase, “Good afternoon, good evening, and goodnight!” before departing the dome to continue his adventure without us. Niccol’s old draft has a more conclusive and confirmative happy ending for Truman (it implies he finds the girl and they have a child), but his last line, spoken before the aforementioned bliss is told merely through images, is actually “Something had to be real!” shouted in the moment he confronts his creator, Christof, who was eventually played in the film by Ed Harris in an Oscar-nominated performance. In that moment, within the original draft, Truman is also himself being confronted by the vapidity and fakeness of the reality he has spent 34 years experiencing, and the stage direction Niccol leaves for the actor describes the aforementioned line as “a terrifying anguish.” In Weir’s film, Truman’s overall reaction to the revelation could be described as something more along the lines of “golly.” Earlier in this essay, I described The Truman Show as the film that introduced Jim Carrey as an actor in serious movies, rather than calling it the one which introduced him as a serious actor. This wasn’t by accident. That movie was Man on the Moon, which came out 18 months later. There is serious acting in The Truman Show (particularly by Harris and Linney), but Carrey doesn’t do much of it. His performance doesn’t veer into anything far removed from what audiences saw him do as the bumbling dad set upon by a supernatural curse in Liar Liar a year earlier.

It didn’t take a Richard A. Clarke-level genius to note Truman’s foresight about Reality TV in the decade following the film’s release. If The Truman Show was mentioned, typically this connection was drawn. In 2008, Popular Mechanics named it one of the ten most prophetic science fiction films of all time. In that piece, Erik Sofge argued that, while Truman’s environment is unreal, so is the contrived environment of Reality TV, but what is compelling about both is the same element: “Truman simply lives, and the show’s popularity is its straightforward voyeurism.” In his analysis of the film’s plausibility, Sofge took issue (perhaps appropriately for a magazine that is supposed to be about science) only with a futuristic weather-machine Christof uses in an attempt to slow Truman’s escape. Weir has also commented on the ensuing developments on the small screen, saying in a 2005 DVD extra: “This was a dangerous film to make because it couldn’t happen. How ironic.” Reality TV’s success might discredit Weir’s choice to make Truman’s a comedic world, which even at the time of the movie’s release he said he had done because he didn’t believe it would be convincing to film audiences that fictional (meta) television audiences would watch a show about a regular person. While it would be absurd to call The Truman Show a failure for Niccol (it, much more than Gattaca, is the reason people are willing to underwrite his movies to this day), time has seemed to imply his unfilmed draft was more on the right side of history than Weir’s realized production.

Perhaps all of this was obvious to Andrew Niccol when he sat down in the early 90’s to write screenplays which were unlikely ever to be produced. In retrospect, with Reality TV’s progenitors already on the air and the Human Genome Project an inevitability since at least the discovery of DNA’s structure in 1953, it should have been obvious to everyone else too. But in retrospect, it’s easy to label what has happened as obvious.

From the comfortable vantage point of retrospect, both 9/11 and the Financial Crisis seem completely obvious. Take 9/11: how could one have predicted that Osama bin Laden would strike the United States? For starters, al-Qaeda had made no secret of their ambitions. It’s popular (and not incorrect) to believe that the national security establishment failed to heed the danger of a major domestic terror attack by Islamic extremists of the sort already very active elsewhere in the world. But the general public can’t credibly claim ignorance either. Bin Laden gave several interviews to western reporters during the 1990’s. Sure, a lot of people threaten a lot of crazy things, but anyone who believed him all-talk would have done well to notice when al-Qaeda literally attacked the United States in 1998 by bombing its embassies in Kenya and Tanzania, and again in 2000 when the USS Cole was bombed in Yemen. If these events seemed too remote by virtue of their place on foreign soil, there was then the (thankfully averted) “millennium” plot to bomb Los Angeles International Airport in December of 1999. And if all that wasn’t enough, of course, extremists linked to al-Qaeda had already bombed the World Trade Center in 1993. These things were not covered up or classified. All of them made The New York Times.

Still, the hijacking and weaponizing of commercial airliners was unprecedented, right? This is a dubious assumption as well. The hijacking of commercial flights by terrorists was routine during the 1970’s. This is well known. Less well known is the story of Samuel Byck, a mentally unstable Pennsylvanian who died (but not before managing to kill a pilot and a police officer) in February of 1974 while attempting to hijack a plane at Baltimore/Washington International Airport, so that he could fly it into the White House and assassinate Richard Nixon. This is not common knowledge but again, not a secret—it was reported at the time. Byck was actually known to have threatened Nixon after being denied a loan by the Small Business Administration, and had even been interviewed by the Secret Service, but they had believed him harmless. All-talk. While Byck never got off the ground, that didn’t stop it from being a bad week for air security around 1600 Pennsylvania Avenue: five days earlier, a jilted army private named Robert Kenneth Preston had stolen a military helicopter from Fort Meade, Maryland and (after taking a flying tour of the Capitol and successfully evading the pursuit sent to intercept him) landed it on the South Lawn, whereupon the Secret Service opened fire on him with automatic weapons and shotguns. Miraculously, he suffered only superficial wounds such that he was able to depart the aircraft and charge the building, where he was finally tackled. For those counting, these events took place approximately three decades before 19 young men set their alarms early on a Tuesday to change the course of history. It bears mentioning that this is the same White House that is widely believed to have been the target of United 93, the one 9/11 plane on which the terrorists were prevented from completing their objective, when a revolt by its passengers caused the plane to crash in rural Pennsylvania, killing all on board.

The Financial Crisis is both more complicated and more predictable. The complexity is inherent to economics as a discipline; like the Great Depression before it, the various causes (and various solutions) of the Financial Crisis will be debated for as long as the event is remembered, even if only in academia. I have read five dense books on the subject, and remain wholly unqualified to deconstruct the systemic risk to the economy that had come to exist in 2008, let alone advise with any confidence how a repeat could be avoided. Many would tell you that greed was the cause of the crisis, but this is a bad explanation. Greed exists and is often a destructive element in human affairs, but greed was not invented at Bear Stearns or Lehman Brothers. Greed is as old as human agency. It is perhaps as simple as being the economic manifestation of hubris. The Biblical cliche is “thirty pieces of silver,” not “thirty credit default swaps.” There was plenty of greed in every prior Wall Street collapse, as well as in every celebrated boom. While relative compensation had increased in the financial sector (even to a degree sickening in its ostentation) by 2008, mere greed cannot explain why we got this crisis when we did. Again, the reasons this happened are complex. They do not fit a moralizing agenda intended to teach a parable about greed, no matter how true it is that our society includes many who need to be so taught.

Nevertheless, it seems inarguable that the crisis would not have transpired without the failure of the housing market that began in 2006. Why the housing bubble grew, and burst, is the subject of entire books itself, but it isn’t difficult to summarize. Picture the least financially stable member of your extended family. For purposes of this discussion, call this person “Cousin Larry.” Maybe Cousin Larry is irresponsible and reckless in managing his money, and leaves you forever shaking your head at the decisions he makes. Or maybe he is solid and trustworthy, but circumstances beyond his control have always left him barely able to keep his head above water. His character isn’t what’s important. The variable to pay attention to is his financial instability. He may have had a lifelong difficulty in finding a job that provides him a sustainable income, or he may be quite well-compensated but nevertheless always wind up with liabilities that exceed his assets. Now, ask yourself the following question: are you willing to co-sign a mortgage for a five-thousand square foot home in an expensive neighborhood for this person? If your answer is anything other than “Yes. And the nation’s entire banking system, private pension infrastructure, and the federal government should all co-sign the note as well. Oh, and is there anything available with six-thousand square feet?” then you are wiser than former Federal Reserve Chair Alan Greenspan. Basically, in the early 21st century, the nation’s entire economic viability had come to be riding on whether or not Cousin Larry was going to be able to keep up with his debt. And by 2006, Cousin Larry had the aforementioned home, as well as mortgages on three other properties that he was trying to “flip,” to take advantage of the hot housing market that the rampant buying of so many Cousins Larry had created, not to mention the fact that he was also making substantial payments on a new Hummer financed over six years, which he used to travel back and forth from The Home Depot where he bought supplies to renovate those other properties, purchased using yet a third kind of debt in the form of a high-interest credit card. Sound ‘sound,’ so to speak? And through a myriad of complexities which, like the threat of terrorism before 9/11, were not a secret, practically the whole of the economy had been bet on whether Cousin Larry’s plan was sound. It must be said that, in the devastating event that followed, many good people who were not stretched as thin as Cousin Larry would lose their homes to foreclosure. Cousins Larry were never a majority of American home buyers. But without the economic behavior of Cousin Larry, there would never have been a crisis so large as to engulf them.


If you have a real-life Cousin Larry whose behavior you have observed over the years, you would probably describe how things worked out in the economy as entirely predictable. And while some did point out the growing risk in the system, they were few and far between, and more seldom heard. If greed were really the factor to pay attention to, all of the greedy people would have taken The Big Short’s eponymous short play, as it was the key to untold riches for those who believed in it. But the greedy instead pressed on with their flawed confidence in housing as bulletproof, and the collapse of 2008 burned them too. Perhaps this is because, in any bull market, there is always a contingent predicting impending doom. It’s never too bad a prediction. Financial downturns have occurred regularly throughout our nation’s history. The question is never if, but simply, how long until. In a world where naysayers can reliably exist, they can reliably be ignored. The bursting of the housing bubble in 2006 is now to the 2008 collapse what al-Qaeda’s attacks in 1998 and 2000 are to 9/11: an utterly clear warning about what was to come, but one that only shines with that clarity through the lens of hindsight.

I say it with the confidence exclusive to someone writing more than a decade after either event. Full disclosure: I am the last person who saw these things coming, or even recognized them for what they were, when they did. When the first plane hit the World Trade Center, for example, I was watching a rerun of Beverly Hills, 90210 on the FX Network, and spent an embarrassing amount of time believing (as the newscasters kept repeating) that it was an ordinary accident involving a small aircraft, and wondering when the news coverage that had broken into my broadcast would abate, and 90210 would resume to reveal to me the fate of Dylan McKay within the episode’s narrative. While both 9/11 and the Financial Crisis seemed totally unprecedented as they were unfolding, it was still not clear contemporaneously that these events would forever change American life. Even after the ink was dry on national legislation like the Authorization for Use of Military Force Against Terrorists, the USA PATRIOT Act, or the Dodd–Frank Wall Street Reform and Consumer Protection Act, how much bearing these new laws would have (and how permanent they might be) was an open question.

The permanence of 9/11 was still up for debate (though not much debated) by the time another Andrew Niccol movie came out in August of 2002, when a second Iraq War was still a hypothetical. Simone (sometimes called S1m0ne), which he wrote and directed (as he would all the films he has subsequently made, rendering his collaboration with Weir an outlier), debuted in ninth place. The movie earned just shy of $4 million during a crowded summer weekend at the box office, and would go on to generate less than $20 million worldwide, a threshold that even The Crocodile Hunter: Collision Course managed to clear that summer. But unlike with the similarly performing Gattaca, the critics weren’t so much on Niccol’s side this time. Even if the reviews weren’t uniformly bad, the critics didn’t think anything particularly interesting was going on in this movie either. For a filmmaker who had, by that point, built a reputation for creating worlds that were thought-provoking, this is probably the unkindest possible cut.

That Simone is uninteresting as a narrative motion picture, fails to take itself seriously in the worst way, and further fails to deliver even on its limited premise, are all points I will not contest. I absolutely believed all of these things when I saw it in 2002. But if I had instead reacted by going to grad school and making a life for myself as an academic specializing in the work of Andrew Niccol, I have no doubt I could be on my 50th journal article about Simone by this point. I could be the world’s foremost Simone scholar. This movie is awful, but still I have that much to say about it. Simone is a masterpiece of recursive irony, none of it intentional.

The movie is something akin to what was once termed a “screwball comedy.” These were satires from the 1930’s and 40’s that typically centered on a humorous battle of the sexes. In 2002, the film was classified by some as science-fiction, but anyone watching it today will find this reasoning as veracious as the contemporary notion that Saddam Hussein was on the verge of possessing a nuclear weapon. The Crocodile Hunter movie I mentioned earlier is actually about Steve Irwin mistaking CIA agents for poachers after a crocodile swallows a top-secret U.S. satellite beacon, but that doesn’t make that movie a “political thriller.” So no, Simone is not science-fiction. While a technology that doesn’t yet exist may be integral to Simone’s story, this is entirely secondary to the shallow satire of Hollywood’s golden age that this movie spends 118 minutes concerning itself with.

Al Pacino stars as Viktor Taransky, a film director at his wits’ end with the temperamental actress Nicola Anders (Winona Ryder). Get it? “Nicola Anders”? Like “Andrew Niccol”? Neither do I. Anyhow, when she storms off his set and refuses to complete his latest movie, it appears the project will have to be abandoned, which likely will put the final nail in the coffin of his flagging career. But, as fate (fate being primarily characterized in this world by its zaniness) would have it, a dying oddball computer programmer leaves a secret hard disk to Taransky in his will. When the director turns it on, voila—he is met with an advanced piece of software known as Simulation One (or Sim-One, or Simone, get it? perhaps) which is a computer-generated actress played by the model Rachel Roberts. Taransky uses Simone to replace Anders in his movie, which is then a smash hit. The public falls completely in love with what they are led to believe is a real person, and Taransky is back on top. But how long will this ruse last? And how many bits of forgettable comic business—like Pacino using a Barbie to cast a silhouette on hotel drapes to fool the paparazzi, or putting lipstick on himself to kiss autographed headshots (yes, this literally happens)—can Niccol pencil into this film’s remaining two-thirds before concluding things with an ending that is both totally implausible, and 100% predictable?

The main flaw here, in a film with many, is that Roberts’ Simone has no personality. Her lines—which in the world of the technology onscreen are generated by Pacino reading them into a microphone, and the computerized Simone then parroting what he does, resulting in a presentation that haunts my nightmares—always have the flatness you notice when people not accustomed to being on camera read from prepared statements at press conferences, or when hostages are forced to do the same thing by their captors. “This is Simone, and I am making this statement of my own volition.” It all has that flavor. An obvious answer for why is that Rachel Roberts is primarily a fashion model, not an actress. That’s certainly what I assumed to be the case when I saw this movie in 2002. But the reality is something more complicated. Roberts hasn’t acted much since Simone, but when she appeared as a hippie who introduces Don Draper to hashish in a 2013 episode of Mad Men, none of the robotic stuff was evident. It was 100% believable and charismatic. This, despite the fact that storylines on Mad Men about the subject of Don Draper and hippies were about as effective as songs by Right Said Fred about subjects other than the problem of being too sexy. I had to take an acting class to graduate college, and after a semester of coursework, the main thing I had learned was that acting is not easy. Taking words off a page and making them sound natural from your own mouth (to say nothing of demonstrating high-level emotions) is really hard. Nevertheless, the role on Mad Men and a few other sporadic examples definitely attest that this is a talent Rachel Roberts possesses. She may never be Meryl Streep, but she’s definitely not the actress one would think if this movie were all that they had to judge by. So something else is going on here. I can’t say exactly what that is; I can only surmise that this is the performance Niccol wanted her to give. This, to her misfortune, is the product of his direction. Perhaps he was concerned that having a real person play a computer-generated actress would not be effective otherwise, the way Weir thought a Truman Show wouldn’t be believable if it resembled actual real life. This isn’t a bad thought, except for one reality that can’t be ignored: the entire plot of this movie hinges on the supposition that moviegoers (and literally everyone else; her impact as a public personality comes to grow beyond that industry) would see Simone and fall wildly in love with her. At one point, tuxedoed partygoers run crazed and topple over into a body of water merely because they mistake another woman from behind for Simone. This is the reality (however heightened) Andrew Niccol wants us to accept in this movie. And Roberts’ android-like presence is what that reality is predicated on. While, like I said, I don’t believe this is a science-fiction movie, the problem at work here is one you typically see in that genre.

In science-fiction, the rules can be whatever you want them to be. The only limiting principle is that they have to be consistent. If George Lucas had wanted to give Luke Skywalker the power of flight in The Empire Strikes Back, he could have done it. Luke does other things that are vaguely super-powered. It wouldn’t have changed too much about the movie. Audiences in 1980 would have just accepted that this was one of the ways he uses the Force, and nobody would have said a thing about it in the last 40 years. What George Lucas could not have done is give Luke the power of flight when he’s training with Yoda, and then have him still fall down that shaft and out into open air, clinging for his life, at the end of the movie after he fights Darth Vader. Audiences would have said, “Why doesn’t he just use that flying power we saw him demonstrate a half-hour ago?” The rules wouldn’t be internally consistent. Andrew Niccol cannot have Rachel Roberts give this robotic, remote performance and then ask us to believe not only that people in this world wouldn’t question her reality, but that they would fall more psychotically in love with her than teenage girls do with the Beatles in A Hard Day’s Night.

The power of Simone’s failure on that point is astonishing. It is so sweeping as to make Catherine Keener seem like a bad actress. I’ve never had this reaction to Keener, even in a bad movie, but Niccol’s ability is such as to be able to bend reality to this Herculean end. Keener plays Pacino’s ex-wife, and his boss at the film studio. Through a series of misunderstandings, people begin to believe that Taransky and Simone are romantically involved, when what the director actually wants is to reunite with Keener’s character so that they can be a happy family again with the daughter they share (a very young Evan Rachel Wood). Just when it seems that Taransky has leveraged his imaginary romance with Simone into successful manipulation of his ex-wife toward this goal, and they embrace, she abruptly breaks it off and pulls back: “Oh Victor, I could never compete with Simone. How could any woman?” she says. This utterly unconvincing scene is the worst acting of Catherine Keener’s career, and nobody could blame her.

I’m not ordinarily one to criticize movies for being sexist. The kinds of movies whose sexism I might find objectionable are typically ones I avoid seeing altogether. The horror genre contains many such entries. Those are obvious. But I realize there are other, more subtle ways for sexism to figure into a movie, and this is further complicated by the fact that one’s impression of art is a pretty subjective experience. That said, the sexism in Simone is pretty difficult to ignore. Winona Ryder has too much agency, and she’s too mouthy, so Al Pacino replaces her with the ideal woman: a machine whose every move is literally the execution of his will. Then he uses that fake woman to manipulate yet a third woman (Keener) to have sex with him, but that plan fails only because she is stupid enough to believe his sham. When even the fake woman becomes too much of a problem, he (and SPOILER ALERT here) LITERALLY MURDERS HER. Sure, she’s not real, but that doesn’t stop him from accidentally being framed and jailed for her killing in a sequence that’s pretty dark, before a poorly explained deus ex machina returns things to normal and the credits roll. 2002 was (was it?) a less enlightened age in mainstream entertainment, but I’m pretty sure we had already evolved beyond onscreen worlds where women must exclusively be actual puppets for the male protagonist without any identity of their own, or, at their most evolved, vindictive harpies who have the sole function of thwarting his will. If I’m reading into things here, in fairness to me, those things are written in some pretty large print.

It’s arguable who gives the best performance in Simone. It’s probably Ryder because, though her role is limited and not well written, the actress is always a compelling presence. But I would argue an alternate theory: the bright spot in the movie is Evan Rachel Wood as Taransky’s daughter. This is the relationship that comes closest to seeming like something that is both real, and worth paying attention to. This is very ironic because, more than a decade later, Wood was nominated for a slew of awards (including an Emmy and a Golden Globe, as well as the Critics’ Choice, which she won) for playing an actual robot on HBO’s Westworld. That show, which is both lauded and controversial, takes an entirely different path in addressing artificial humanity than Niccol did. It presents androids as not differentiable from human beings, in a world where the humans generally know they are fake, and imagines the ethical complexities that would result. The result is a much more engaging and realistic world than a single frame of Simone manages to be.

But all of this isn’t why I could write 50 papers about Simone, even if you feel like you’ve read that many by this point. What’s really interesting about this movie is the context. It is sometimes credibly argued that any piece of art is about its creator more than anything else. This is analogous to the truth that any dream is about the dreamer. If you dream about your parents, you will not learn more about them from the dream, because they weren’t actually there and whatever they were doing was just a manifestation of your own thoughts and feelings about them. But in appraising that dream, you may learn something you didn’t consciously realize about those thoughts and feelings, and hence, about yourself. By this theory, all of Andrew Niccol’s movies should communicate something about him as a person, and his particular worldview. If this is true for any film, it’s doubly true here, because this movie is about a film director. Not only is that Niccol’s actual profession, but it’s also a singular profession.

The power of a film director over the people he or she controls is enormous. It is the nature of big studio films that these people generally have absolute control over armies of hundreds of employees for the duration of a production. Anyone who has spent any length of time on a film set knows that, while a director may be (and often is) a very kind person who respects everyone’s dignity, those who are not are generally allowed to flaunt their power without any accountability for how outrageous their conduct might be. Producers and others who control money can control directors, but it is completely ordinary for producers to stay out of the way, barring something totally unusual or, more often, prohibitively costly. Producers often spend their working days miles away in offices, where they control and subjugate their own separate army. On set, the director’s domain is absolute. When a director behaves badly, that director’s army will generally still execute their will unquestioningly. At no level is the work like that of an ordinary job. The culture is that, if one challenges something they believe is wrong, they must do so with the very real fear that not only will they lose this job, but that they will be branded as someone difficult to work with in this entire, very insular industry. The director Stanley Kubrick is said to have abused the actress Shelley Duvall with such acute psychological torture on the set of The Shining as to nearly ruin her as a human being. Kubrick sought to isolate her such that she felt totally alone. He frequently told her she was wasting the time of everyone on set. He can be heard in a recording telling others “Don’t sympathize with Shelley.” I have had bad bosses, but never has one instructed a coworker to deprive me of sympathy. In any other work environment, it would be simply preposterous, if it weren’t primarily so twisted an annihilation of a basic human need. But this is art. Kubrick is and forever will be hailed as a genius.

Many revelations of recent years, in the era of #MeToo, are unsurprising when viewed in light of this information. There’s a lot of finger-pointing about who knew what about the abuses of Harvey Weinstein, but nobody who came anywhere near his orbit can credibly claim they didn’t know he was a tyrant. Like a tyrannical director, a tyrannical producer such as Weinstein is often celebrated for the force of will that underpins their artistic conquests. In addition to his closet full of Oscars, Weinstein was honored by GLAAD, the Producers Guild, and Harvard University. Nothing I have thus far alleged about Weinstein was controversial to say when those awards were given. Peter Biskind wrote Down and Dirty Pictures, an entire (not obscure) book chronicling Weinstein’s megalomania, way back in 2004. It didn’t seem reason enough, to Hollywood, to challenge Weinstein’s reputation as a global humanitarian. Somewhere, a former low-level Miramax staffer is right now writing a check to a psychiatrist, and disagreeing. That Weinstein was also a sexual predator whose power-madness was so extreme as to motivate him to trap journalist Lauren Sivan in a restaurant basement and force her to watch him masturbate into a potted plant is shocking and disgusting, but it is not revelatory about his character. It completely fits with what everyone already knew about him 20 years ago. Biskind’s book seems today, at best, incurious for having missed Weinstein’s predations with women, but that book does conclude with the mogul physically assaulting a journalist. Again, published in 2004.

This is the film industry. It is not like other industries. To many who work in it, the only part of 1994’s Swimming with Sharks (a breakout role for Kevin Spacey as a monstrously cruel Hollywood boss) that seems beyond belief is its tidy ending. Many people, in other fields, have had a boss they think of as cruel, or pervy, or who crosses boundaries, but media is simply peerless in this regard. While #MeToo may go a long way toward stopping the sexual line-crossing, it may be limited to that arena. “Okay, no more sex stuff,” one can imagine these sociopaths limiting their self-appraisals to. It’s very likely that ordinary platonic abuse of human dignity will continue to be tolerated. That psychiatrist is going to be in business for a long time. Whether or not Andrew Niccol meant to graze these issues, this is the same Hollywood which he chose to write about in Simone.

Now I don’t know anything about Andrew Niccol as a human being, or as the manager of a film set. I have literally not a single reason to think he’s a showbiz sociopath of this sort. When he is interviewed, he seems like a quiet guy from New Zealand, which is probably what he is. But any person, no matter how nice, takes the director’s chair with the full weight of the foregoing information about Hollywood power dynamics. This is what you should bear in mind, when I relate the following facts as I understand them:

When Andrew Niccol wrote and directed Simone, he was a film director in his thirties, and married with a young daughter. He chose to write this movie about an older film director, who is divorced with a teenage daughter, and who becomes the subject of rumors that he is having an affair with his young artificial movie star, who is actually his Frankenstein creation. He cast, in this role, Rachel Roberts, a fashion model, which is a role in society generally associated with artificiality. On August 13, 2002, he attended the premiere of Simone with his wife, the actress Susan Jennifer (Grace) Sullivan, who is best known for Friday the 13th Part VII: The New Blood, a movie where Jason murders her by slamming an axe into her face. Niccol further chose, in crafting his movie, to have the movie director character be accused of the murder of his young actress. What happened next is unclear. The gossip pages don’t usually report on obscure film directors, nor on actresses if they are not otherwise of note. However, when 2002 ended, Niccol was no longer married to Grace Sullivan. Sullivan later became the subject of an internet rumor that she had died in 2009, which was so pervasive that IMDb to this day lists her as deceased, though online Friday the 13th fans claim to have debunked it, with compelling proof that she was alive at least as recently as 2013, having battled cancer and shared her story of overcoming it online. However, that very strange tale is only a digression from the fact that, by the time 2002 ended, Andrew Niccol had somehow become the husband of Rachel Roberts.

There are ironies within ironies within ironies to explore in this odd, incomplete narrative. Niccol and Roberts are married to this day, and she appears in small parts in nearly all of the films he makes. Perhaps he expected Simone’s audience would take for granted that she was lovable on the screen because he had fallen in love with her, and was too blinded by his love to realize that what he saw in her was not translating through the android-like screen presence he imbued this movie with. Like I said, Andrew Niccol seems, in interviews, like a very low-key guy. He is not crazy. He is a successful auteur and will probably live happily ever after, as much as anyone does outside of the movies. But I guarantee you that if ever he were to achieve infamy for some reason, like by committing a sensational crime the way celebrities sometimes do, Simone is the one film of his that internet crime-obsessives would parse for clues. Its self-examining rabbit hole would take up the whole screen.

Out of Niccol’s visions of technology’s future, Simone is both the most mundane and the one that seems most inarguably certain to become reality in the near term. We can already create digital actors, and one day soon they will be as vivid as real people. Whether or not an artificial celebrity will eventually conquer pop culture the way Simone argues, only time will tell. The takeaway is that Niccol fails to convince viewers it’s possible in the world he authors here.

Ultimately, Simone’s Achilles heel is its misconception of human psychology. In this situation, faced with this technology, people would simply not act as Andrew Niccol imagines they would. As time went on in the years following the release of all three of his aforementioned films, it became apparent that something like this was plaguing his work from the beginning. In 2003, Spike TV tried to create The Truman Show as a literal concept. The Joe Schmo Show was about an ordinary person, Matt Kennedy Gould, who believed he was having ordinary interactions with other ordinary people. Unlike Truman, Joe Schmo knew he was on a reality competition show, but that was all he knew. He was unaware that all of the other characters were actors and that what was happening to him was a scripted contrivance. The result? It wasn’t compelling television. Joe Schmo’s was not the face that launched a billion-dollar enterprise like the kind pictured in The Truman Show. It ran for two years and was a hit relative to the size of Spike’s audience, but did not make waves, even as Reality TV was picking up steam. Spike tried to revive the concept in 2013, but it didn’t last then either. Gould’s reaction to learning of the hoax in the first season’s finale also rebuts Truman Burbank’s authenticity as a character. While Joe Schmo did plaintively cry “What is going on?” he then received a $100,000 prize and enjoyed brief fame when the episodes aired, fully cooperating and not demonstrating a feeling that he was betrayed, despite the lies.

Truman’s lack of authenticity is apparent when the film is viewed today. Again, this is a product of retrospect bias. We know today what does and doesn’t work on Reality TV, and how. If you have teenage children who enjoy Reality TV, they were not born when this movie was new. If you try showing it to them now, and advise them you are doing so because of their interest in Reality TV, they will be utterly baffled. What’s on screen won’t resemble the thing they like in the slightest. Today, The Truman Show is much less like Reality TV, and much more like 50’s TV, but most of all like a middling 1998 comedy movie. It is striking, to watch now, how much less affecting it is than it was. This is because in 1998, Niccol had to compete with other visions of the future that were being contemplated at the time, and his unique one could hold its own. In 2019, the movie has to compete with the actual world as we’ve known and lived it, and The Truman Show is hopelessly outgunned on this front.

In the long run, Niccol’s original, dark vision may have actually defeated Weir’s campy vision. In 2014 the psychiatrist Joel Gold and his brother, the neurophilosopher Ian Gold, penned the book Suspicious Minds: How Culture Shapes Madness. This book is actually an excellent primer on delusional belief in human beings, its possible causes, and its many manifestations. But the hook that draws the reader in (and what came to populate press materials and news stories about the book’s release) was Joel Gold’s work with patients suffering from what he termed “The Truman Show Delusion.” This is precisely what it sounds like: a false belief that a person’s life is secretly a TV show, and that this reality is being hidden from them. While the term was coined for Niccol’s work, the patients in the book exhibiting this syndrome often seem to be inspired more by the actual Reality TV as we know it. They are people who crave the self-importance of fame. Nevertheless, these are dark stories. The people afflicted do not come off like Peter Weir’s Truman or a Kardashian. If they have an analog (if there is an art this life is most imitating), it’s much more so that dark Truman of Niccol’s original draft. But it’s not surprising that the actual movie would still appeal to them.

In 500 years, nothing in The Truman Show will seem like the technology of the future. Many people probably will live in enormous domes, as colonization of other planets becomes a reality. Cameras will be more ubiquitous than the movie ever conceived; it’s even possible that 3D scanners will record every moment of all of our lives in a perfect facsimile of what was experienced. Even the weather-machine isn’t beyond the pale. Everything in The Truman Show will appear, then, as a snapshot of a past age. That being so, what will a person living 500 years from now describe this movie as being about? I venture the answer is paranoia. This is a movie about a paranoid person who develops an absurd contrarian belief that doesn’t match reality, and which everyone who loves him tries to dissuade him from thinking. Of course in the end, he’s right. Just because he was paranoid didn’t mean they weren’t out to get him. Or at least use him. This film is a very powerful statement for real people who are suffering delusions. It says that even when reality is constantly knocking back your warped perceptions, your hunch may actually be onto something, and in the end it will be validated.

It has to be said that delusions are never the product of their content. The Brothers Gold do an excellent job of explaining this. For example (my example, not theirs), a very clichéd delusion of the last several decades is the claim that the CIA put a chip in your brain to spy on you. Obviously, this delusion can only exist in the world as we know it, where Congress chartered the CIA in 1947, and the microchip was invented through the work of Jack Kilby and Robert Noyce a little over a decade later. However, in a world where these things had never happened, the person who would otherwise have believed them is still not going to be a happy, well-adjusted one. He instead believes that the Illuminati are trying to assassinate him, or that General Motors wants to kidnap him and steal his revolutionary ideas, and so on. Seeing The Truman Show can’t make anyone delusional. The delusional simply turn on the movie and see a reflection of their sickness that functions as wild wish-fulfillment. Even in other movies that lend credibility to paranoia, you don’t ordinarily see the kind of everyman normalcy in the main character that Truman is imbued with. How could he be crazy? He’s the affable Jim Carrey.

That may last forever. Delusional people may still find this movie relatable in 500 years. But for regular people, The Truman Show looks less and less like the world they live in every year, not more and more. Certain things just jump off the screen as implausible, like when Truman is able to escape being tracked within his own world. Despite all those cameras, they can’t find him, because they don’t know where to look. This is a movie that is purported to have insight about where the world was going in 1998. I could say “blah blah GPS…” and make an overcomplicated argument, but a simple one will do. Here’s a question: if the writers of the aforementioned Crocodile Hunter were able to understand the technology of a tracking device, such that they could make it a plot point in their movie in 2002, which was for children, is it really reasonable for Niccol and Weir to ask that we believe Truman’s TV overlords wouldn’t have insisted on it? They have a fucking weather-machine.

In 2011, NASA named Gattaca the most plausible science-fiction film of all time. The impetus for even ruling on the issue was the then-newsworthy 2012 apocalypse conspiracy, which had spread much astronomical misinformation (remember the imminent return of Planet X?) and spawned a film of the same name. As you are now reading this, that movie did not predict the future, something the world could be concretely certain of when the year 2012 ended as scheduled on December 31st, with the world still extant. But NASA felt the need to get out in front of the story (you imagine the Russians would have probably used a pencil and simply waited for the calendar to turn). In the 1960's, it was a common refrain amongst contrarians to bemoan the fact that NASA was spending billions on spaceflight while there were still Americans living in poverty. Today, as vacancy on the moon sits at historic highs, the agency is apparently spending the money on ranking movies. Meanwhile, Jeff Bezos is managing to do both at once. The science honor for Gattaca wasn't undeserved, because the science- of the film is painfully plausible, if not inevitable. Since Gattaca never says too much about how any of this is being done, it doesn't make promises about the technology that are too big to keep. The problem is, the -fiction is not especially realistic. It's never clear why Vincent wants to become a celestial navigator and leave earth. Is it because they say he can't? He's not a defiant character otherwise. The closest thing to an explanation of his motivation seems to lie in his need to beat his brother in swimming contests, and these contests are presented as thematically important, but they don't really shed light on his character the way one surmises Niccol intended them to. He's a strait-laced square who fits perfectly into his Gattaca box. He has no problem living with the superhuman discipline necessary to protect his hidden identity. The only cracks in his armor (the times when he seems to have the emotional dimensions of an actual person) appear in his budding romance with Irene, which has the unfortunate timing of beginning less than a week before he is supposed to launch. But if this gives him any hesitation about leaving forever, if he is at all torn about making the old Irish exit and never seeing her again, we don't see it. What awaits him in space? Why go?

Believe it or not, 2011 was not the first time NASA had offered official findings on the subject of Gattaca. In 2007, the agency published Societal Impact of Spaceflight, a scholarly collection that examined whether the last five decades of venturing forth from earth had been worthwhile for civilization. In an essay titled Are We a Spacefaring Species? Acknowledging Our Physical Fragility as a First Step to Transcending It, the cultural critic M.G. Lord confronted the human limitations on space travel (such as our limited ability to withstand the radiation space is so famous for) and along the way name-checked Gattaca:

“Gattaca, a 1997 movie written and directed by Andrew Niccol, describes a space program in which astronauts are chosen based on genetic superiority. In the movie, this is a bad thing. The hero is a love child, not the product of a eugenics exercise. Yet the hero’s short-term triumph—securing a spot on a desirable space mission—may seem less admirable if his weak genes expose him to a fatal illness. In the 1970s, in vitro fertilization procedures were uncommon as well as ethically suspect. Today they are performed frequently and not considered an ethical problem. Likewise, other eugenics procedures that are not approved today may become commonplace in the future.”

Putting aside the bright picture the author sees for the future of eugenics, she is onto the movie's fatal flaw. Vincent will probably die in space, proving his detractors right. It is fairly obvious that Niccol saw this story as an allegory about racism, or perhaps about any of the unjust forms of discrimination that human beings are famous for. He has said in interviews that science-fiction appeals to him because it can be a vehicle for other ideas. But if that's the case, Gattaca is an extremely short-sighted movie. Discrimination of the irrational sort has always haunted human experience, and much better artists have shed light on it, particularly in the 20th century. What Lord sees, or at least sees in part, is that in a world with actual, fully-realized genetic engineering, there might be utility in discriminating against someone like Vincent. The notion raises all sorts of questions (ones humanity is eventually going to confront) that do not get asked in Gattaca.

Jude Law's Jerome is a perfect human, and was a competitive swimmer, but became suicidal because he couldn't do better than a silver medal. This is unrealistic. Somebody is always going to get the silver medal. Athletes confront this dilemma all the time. Someone who may be the best in high school will find in college they weren't quite as good as they thought. And if they should be lucky enough to outshine most of the competition there, they will find themselves matched at the professional level. Even the greatest of all time are only marginally better than those who were their stiffest competitors. This is the world we live in. What Gattaca could have asked is whether there would even be athletic competition in a world like this. Further, what if Jerome had just not particularly liked swimming? Or sports at all? Imagine if, instead of the movie we have, the story were about Vincent being genetically designed to withstand radiation, and prepared his whole life to leave, only to be conflicted because he loves Irene and wants to stay on earth with her. The actual Gattaca is very caught up in notions of superiority and inferiority, as though what a perfect genetic human would be is something that could be universally agreed upon. But this is never true. There are always too many variables, many of them qualitative.

Conflicts like the one faced by the Vincent of that imagined, hypothetical Gattaca, precisely because they are predicated on an irrational, immutable human trait (romantic love), are going to be the interesting ones when the technology of the future arrives. That's because these are the things that have always been interesting about the stories we tell each other, and the quality that defines those we canonize. When millions of students are forced to endure all five acts of Hamlet this year, and some find they actually like it, it won't be because it provides some insight into the Elizabethan view of Danish court politics, even if their English teacher wants to talk about this. Fiction, even speculative fiction, is interesting because of what it says about people. Hamlet is a play (mainly) about a young person confused over who they are, and what they should do. Young people will forever confront this dilemma, no matter how old Shakespeare gets.

In the modern era, one of the regular metrics applied to any piece of filmed entertainment from the past is whether it "holds up." This is shorthand for whether a thing, which was of quality upon its initial release, is still deserving of that designation now, when it is watched again. It can be applied to music or books, but most of the time it's something you hear in conversations about movies or TV. In some ways, finding a thing doesn't hold up is totally expected, like when you find that while you may have loved The Smurfs as a seven-year-old, it's not something with which you want to spend your time at 40. But discussions about the ability of something to hold up are much more nuanced than that. Today, people regularly find themselves reacting quite differently to something they first watched only a decade ago, when they were already in their adult body and brain.

Any discussion about speculative fiction in screen form would be incomplete without a mention of Rod Serling's The Twilight Zone, the primetime sci-fi/fantasy anthology series that ran for 156 episodes from 1959-64 on CBS. The only thing more remarkable than the fact that this show somehow produced an average of 31.2 episodes a season which were judged to be largely good (often great) is the show's ability to hold up, even sixty years later. Lest you think they were widely distributing the workload, 128 of those episodes were written by just three people (Serling, Richard Matheson, and Charles Beaumont). Yet these stories were (and still are) compelling to watch. What the future might be like was often the subject of that week's story. Today, many of these still have resonance. The Brain Center at Whipple's (episode 153) was about an industrialist who begins replacing his workers with advanced machines (not the only episode on the subject), only to suffer the same fate in a chilling final scene. The Trade-Ins (episode 31) featured an elderly couple who seek to exchange their failing bodies for younger models (also not the only episode with this concept), but are trapped in a Gift of the Magi dilemma due to their lack of means. In the years since those episodes first aired, technology has progressed by leaps and bounds both in replacing human workers with automation and in extending human life. Real people have had to deal with ethical questions much like those the characters faced, even if the technology still lags behind Serling's vision. But with artificial intelligence and bioengineering continuing to advance, it seems very likely the tech will eventually get there, and beyond. Nevertheless, these episodes will always have social meaning, because they are very plainly trying to imagine what actual people would do in this world. When that world exists, people will continue to be people. They will not magically start making choices that defy reason like Gattaca's Vincent, or Truman, or (God help them) Viktor Taransky. The Twilight Zone's black-and-white production values may make it look antiquated, but those aspects of the show have been static for more than 30 years. My first TV (a hand-me-down) was black and white, but I am very much a child of the color age. If I find this show tolerable despite the fact that it appears old, anyone can. People will be able to watch this show for a long time, should they choose to.

And they have been. The show is currently airing its third reimagining, on the CBS All Access streaming service. That endeavor is being shepherded by Jordan Peele, whose recent Us was the highest-grossing movie of 2019 in the United States not based on a preexisting property. And that designation is only partially accurate: Peele was inspired to write the movie by a 1960 Serling-written Twilight Zone called Mirror Image (episode 21). In that particularly terrifying half-hour, a woman enduring a long wait in a bus station is suddenly confronted with a doppelgänger who wants to take her place, but (as is often the case) nobody believes her. The Twilight Zone inspiring other writers to create their own stories, which then become extremely successful at the box office, is not without precedent. Many claim it as an influence on their work.

The Truman Show bears many an absurd similarity to a 1989 episode of The Twilight Zone's first revival entitled Special Service, even having its events set in motion when the main character ("John", the Truman figure) inadvertently discovers a camera when a wall falls in the opening scene, like Truman with that production light. An irate John is then faced with the dilemma of whether he should quit (escape) his show, or go on pretending as though he had never learned the truth. By contrast, Peele's Us has so little in common with Mirror Image that it's hard to picture anyone with a broad enough knowledge of science-fiction to have seen both connecting the two without being prompted to do so. But that didn't stop Peele from repeatedly mentioning Mirror Image and Serling while doing press for Us, though this may have been because his deal with CBS was already inked. Whether Niccol ever saw Special Service is unclear. But he and The Truman Show were nevertheless nominated for a Best Original Screenplay Oscar, and even in a possible world where Niccol had sinisterly stolen this story whole cloth from Special Service, it would still be a historical abomination that Truman somehow lost that award to Shakespeare in Love, a film which 1) achieved its large-scale Oscar bounty through a definitely sinister campaign by the aforementioned unconventional horticulturist Harvey Weinstein; and 2) I can promise you, definitely does not hold up.

It's not by accident or changing taste that holding up has become a standard the culture asks media to meet at the present moment. If I were trying to write this essay 20 years ago, about films of the same relative age as those I've mentioned, I'd be woefully unable to do so in such detail. This is because at that time, here-today-gone-tomorrow was the nature of our experience of movies. Back then, you were generally limited to what was available in your local video store. That video store had a mix of VHS tapes made up of new releases, recent releases, and timeless favorites. Shelf space was at a premium, and it would be the unwise video store proprietor who displayed titles that nobody was likely to rent. A title would be released, spend a limited amount of time in the new release section before moving to the appropriate genre section, and then disappear forever. Very few movies achieved relative timelessness. These were films like the Star Wars trilogy, the Disney animated canon, and various similar others, but never too many. Everything else just evaporated within five years or so, as more new and recent releases needed that space. Even if you lived in a city with a lot of specialty video stores (typically just New York or Los Angeles), you would still be limited to the titles the operators of those stores thought appealed to their specialty customers. So, it is possible I would be able to locate The Truman Show in a Blockbuster, even though this is unlikely. Gattaca would not be there, but I could probably find it in a specialty store. Simone? Absolutely not. In 1999, a 20-year-old Simone would be out of print (and assuming it came out in 1979, it would probably never have been in print on VHS at all). It would be something you only heard stories about, if anyone remembered to tell them. It is not a movie you would even be likely to see on cable. If I wanted to write anything about it, and I didn't somehow have access to a film print, I would be doing so purely from memory. And while my opinion of the actual movie hasn't substantially improved or diminished since 2002, there was much I found I had forgotten when I sat down to rewatch it a week ago.

And television? Forget about it. There was simply no practical way to ever see a TV show again. You could buy things like M*A*S*H or Star Trek through mail order at an exorbitant cost (just 1-3 episodes on each tape, and each tape ran about $20), but that was it. The Twilight Zone was available intermittently in this form, with a limited selection of episodes. When I was a kid, my video store had one tape of the series with four episodes on it, and that was all I experienced of it until I was nearly an adult, when my parents got the Sci-Fi Channel around 1996, which was then re-airing it and still does so today. The idea of purchasing a "complete season" was unthinkable.

This all began to change, both for movies and TV, with the introduction of DVD, also in 1996. While it was initially a niche product for the high-end market, the economics of the format made it fated to outgrow that designation. DVDs were smaller and simpler, destined to become cheaper to manufacture, ship, and display for sale or rent. Meanwhile, online retailers like Amazon (and Netflix in the mail-order rental market) could offer unlimited shelf space. A host of things nobody had seen in years were suddenly available again, and people were confronted for the first time with the reality that a thing they had once loved was actually terrible. Of course E.T. was still great, and that is why your video store had probably held onto its copy since the initial release on VHS, or repurchased it during one of its rereleases on that format. But on DVD, viewers could realize The Beastmaster (which they may have enjoyed side by side with E.T. in the summer of 1982) wasn't quite as good as they recalled. The irony, of course, is that DVD itself now seems antiquated. One day, its short reign will be a blip in the history of home media. Streaming video is now the standard way to enjoy these things, and it is likely to remain so for many years into the future, probably for at least as long as movies and TV themselves remain something we amuse ourselves with. The services and the business models may change, but if you are enjoying a movie in 100 years, it will be data that enters your home in electronic form, rather than being inscribed on a disc or tape or anything else you obtain elsewhere and put in a machine.

“The future is now! Soon every American home will integrate their television, phone and computer. You’ll be able to visit the Louvre on one channel, or watch female mud wrestling on another. You can do your shopping at home, or play Mortal Kombat with a friend from Vietnam. There’s no end to the possibilities!”

These are the words of Jim Carrey, spoken on a rooftop, during the climax to a film. But Andrew Niccol didn’t write them. They are from 1996’s The Cable Guy, a movie that was not a huge hit with the critics, nor the audiences who saw it. It could better be said to have introduced Jim Carrey as an actor in commercially-underperforming movies. It did not catapult first-time screenwriter Lou Holtz Jr. to Niccol-style acclaim. He was a prosecutor before writing The Cable Guy and creating a bidding war among Hollywood studios, and he never wrote again, instead reactivating his California Bar license in 2000 and returning to the Los Angeles District Attorney’s Office. It’s unclear whether the afore-quoted monologue, which appears twice in the movie, was specifically written by Holtz, who has the sole writing credit on the film but ceded the project to producer Judd Apatow and director Ben Stiller during the process. Unlike most, I found this movie immediately hilarious the summer it was released (Stiller’s dual cameo as the Menendez Brothers/OJ Simpson-like twins Stan and Sam Sweet is unforgettable). I venture that this film holds up, as much as a comedy ever can. If you disagree, try streaming Ace Ventura or The Mask today, which were perceived as amongst Carrey’s superior, funnier efforts at the time. They are not funny today. He is irritating in them. In fairness (or unfairness), future generations will probably take no notice whatsoever of any of the films mentioned in this paragraph, and if they do, it will probably be purely for Carrey’s grossly homophobic caricaturing, and they’ll only take such notice so they can unearth his corpse and posthumously cancel him, a pastime which shows no signs of going out of style.

My appreciation for The Cable Guy has not changed, but one thing that has is my respect for the dialogue I quoted, as well as a handful of other things about Jim Carrey's socially-bizarre cable installer Chip Douglas. The monologue itself has gone, in less than three decades, from being something exaggerated for comic effect, to being something inevitable, to being something unremarkable because it's undeniably true. Spectrum, my local cable company, has been trying to upsell me on their "bundle" of phone, internet, and TV for at least the last decade. I currently pay them only for the internet, but I nevertheless receive all of my TV through it, and while I lack a landline, it is inevitable that before I die (probably much sooner) my internet provider will also be the provider of my cell phone service, or vice versa. Carrey's monologue even seems to predict, in terms of content, the refined highs and crass lows this new age would mark. Latest series entrant Mortal Kombat 11 was released this year, and rest assured, you can play it with a friend in Vietnam. But more than all of that, Carrey's Chip Douglas is a weird loner who replaces his inability to maintain real relationships with pop culture references. Today, the world has never been so full of these people. This is like three quarters of what's on Twitter at any given time. On balance, this forgettable movie shines much more light on what has defined the subsequent years than anything Andrew Niccol brought to the screen.

Of course, The Cable Guy wasn't trying to predict the future. It doesn't imagine a whole world apart from our own. It simply highlights one corner of the current world we live in, media, which has come to dominate nearly all the others, and speaks with prescient clarity about the implications of where things are poised to go. If there are movies that predict the future in any way worth paying attention to, it is ones like these. But while The Cable Guy is meant to make you laugh rather than think, and its predicting is inadvertent, there is another sort of movie that highlights, head on, what we should notice in our present as being important for the future.

When I contemplate historians examining our age in 100 years, there is one question I have about how they’ll see things, more than any other. I wonder if they will describe 2019 as a year when America was at war. If the answer seems obvious, I promise it is not. While the US has not formally declared a war since 1942, the easiest answer is to say yes and cite the war in Afghanistan, which began weeks after 9/11 and has continued for 18 years, making it an inevitability that an American will soon die there who was not born when the World Trade Center fell. But therein lies the complicated part. 18 years is a lifetime, however short of one it might be. At this point, the war in Afghanistan has become far and away America’s longest war. At what point will it stop being an exceptionally long war, and just become the status quo? It may end, and that end may even come this year. It seems unlikely, but recent developments like the on-again-off-again possibility of an agreement with the Taliban have made an end appear likelier than it has been in a while. But even if it is capped at 18 years, how much will change about American military conflict abroad? The military is engaged, and has been engaged, in combat situations in countries all over the map on an ongoing basis in recent years. These include Afghanistan, Iraq, Syria and Libya. Those conflicts have all spawned a lot of news. But there are countless others that rarely warrant mention. Some of these are reported to be actual war zones where our military functions purely in an advisory capacity, but anyone who saw this movie the first time, when it was called Vietnam, always takes that designation with a grain of salt.

In October of 2017, an ambush killed four US soldiers in the African nation of Niger. The attack by Islamic militants was the deadliest one, up to that point, in Donald Trump's young presidency, and when he did not make a statement about it for 12 days, criticism in the press and elsewhere began to mount. When a reporter asked about it during a press conference, Trump made a rambling statement that, while probably not designed to escalate the controversy, probably couldn't have been better designed to do so. Effectively, Trump congratulated himself for communicating with the families of the deceased service members, and took unveiled shots at his predecessors (Barack Obama by name) for not having done so, or not having done so as perfectly as he. What he meant to say specifically is arguable, but the broad point, that he was claiming to be better than Obama on this issue, is pretty plain. The remarks were immediately criticized for being inaccurate and inappropriate. Americans who noticed the controversy seemed to immediately take their political battle stations. If you liked Trump, he was correctly highlighting a difference between himself and Obama, and any claim otherwise meant his words were being taken out of context. If you hated Trump, he was a petty embarrassment to the Presidency, who proved with those untrue words that he was not qualified to succeed Obama. Voices on both sides of the mainstream politicized the controversy in the following days. However, and this is what is notable, any controversy about the appropriateness of a military presence in Niger was quickly relegated to the fringes. Nobody disputed that it was a tragic loss, but the necessity of that loss was not something Trump or any of the myriad other politicians with a hand in foreign policy were held to account for. There were questions about how this could be avoided in the future, and demands by John McCain and others for an investigation (which did occur), but the idea of avoiding such deaths by avoiding war in Africa was not elevated to a position of any meaning. By any analysis, Trump's press conference was more controversial than the military action that was its catalyst. That a thing like this would happen, in a country many Americans have not heard of, let alone know American soldiers are serving in, was and is the status quo. Republican Senator Lindsey Graham warned to expect "…more, not less…" American military operations in Africa as the War on Terror begins to "morph", and that work begun on the continent by Obama would be expanded. Democratic Senator Richard Blumenthal said of the lost personnel, "I could not look those families in the eye," but he wasn't talking about sending their deceased loved ones somewhere questionably appropriate to sacrifice their lives. He was instead chalking the catastrophe up to a shameful failure to provide sufficient intelligence that could "…enable them to be successful in their missions…" Among the voices that mattered in both parties, it was taken for granted that those missions were above reproach.

The foregoing being the case, for the question of whether this will one day be regarded as an age of war for this country, does it matter whether the approximately 14,000 US troops presently in Afghanistan are there or not? They represent fewer than a quarter of US troops currently deployed just in the region, and substantially less than a tenth of all US military personnel currently stationed abroad. The fate of the "Afghan War" can be set aside. The real question to ask is whether this scenario, taken as a whole, will be regarded as war. It has existed on a rolling basis since 9/11, and those who take the position that it existed even before that do so in good faith. There is no reason to believe that it will change. It may eventually change, but there's no path to that change in sight. So it is wholly possible that in 100 years, those historians will be living in a continuation of this status quo. So what will they call it? Like America's other endless, abstract war, the War on Drugs, the term "War on Terror" will probably eventually be abandoned for policy reasons. President Obama quietly ended the War on Drugs in 2009, in name anyway, but left the prosecution of that war almost exactly as he found it. The term was what was objectionable, and I predict the War on Terror will eventually meet a similar fate. But if, absent the name, this new status quo or any version of it persists in 100 years, it will be only that. Ongoing combat operations in a faraway theater will not automatically equate to war, as the term has historically been understood.

It may be a question of scale that allows these wars to continue. It should be noted that the loss of four Americans in a single incident in 2017, while tragic, was happening with regularity when the wars in Afghanistan and Iraq were at their heights. At that time, the country was widely perceived (the point wasn't even argued) to be at war, and if there wasn't quite enough opposition for the tastes of those so opposed, that opposition was a vocal component of the mainstream. President Obama's early opposition to the Iraq War, a minority position at the time of the 2003 invasion, was in no small way a part of his appeal as a candidate for the Democratic nomination in 2008. That he became President, and that by the time he left office something like Niger was noteworthy enough to raise eyebrows, should signify a major accomplishment for those who consider themselves to be anti-war. If America was still at war, it was undeniably less so.

Perhaps when asking this question in 100 years, historians won't primarily concern themselves with the constellation of relatively small, unstable countries where these limited firefights are happening. They may instead look at the broader trends, the world powers, and a very untraditional sort of war.

Earlier I said there are other movies like The Cable Guy, ones which point out to the viewer something in the present world important to notice for the future, but which do so not through inadvertent comedy. Instead, these films earnestly showcase change that is happening right now. If one were looking for films like these which bear on the question of where American war policy is going, the two most prescient I have been able to find are the two (and only) collaborations by screenwriters Lawrence Lasker and Walter F. Parkes: WarGames (1983) and Sneakers (1992).

In June of 1983, MGM released director John Badham's WarGames. If you saw it at that time, or for several years thereafter, you may recall WarGames being the first piece of entertainment in which you encountered the concept of computer hacking. The film was far enough ahead of its time that the word doesn't appear in it. While "hacking", used in this context, can be traced back to at least the 1960's, the word was not in the popular lexicon, particularly not before the 1984 publication of Steven Levy's history of the pursuit, Hackers: Heroes of the Computer Revolution. In WarGames, we only know the hacker protagonist (Matthew Broderick) as David Lightman, a suburban Seattle high school student who, although smart, is without much interest in his classes. He's much more engaged with the computer in his bedroom, and the attached device (and here's what most people hadn't seen before) that allows the computer to use the phone to call other computers, and communicate with them. There's no talk of the internet in this movie, even though an audience seeing it today would immediately see it as primarily about the internet. David is primarily concerned with using this practice to find new games to play, but he engages in a fair amount of seemingly harmless mischief as well. When he learns about a gaming company with exciting new games soon coming to market, he makes it his mission to dial various numbers in the town where the company is located, hoping he can gain access to its computer and sample the games ahead of time. When he instead finds an ominous menu of games, he can't believe his luck, and selects one called "Global Thermonuclear War."

Of course, it isn’t the gaming company he’s reached, but a highly advanced government computer named Joshua. And the game isn’t merely that, because this computer has access to America’s nuclear arsenal. David tries to abandon what he’s begun, but Joshua can’t. The fate of the world is suddenly what’s in play.

The 1980's were the hottest time in a generation for the Cold War. In the ever-reliable prism of retrospect, this is hard to believe: Russia was not as powerful and eager to fight as it seemed, and neither was Ronald Reagan. It may be hard for people born subsequently to visualize, but if you were alive then, or if you now get your history primarily by using Phil Collins videos as primary sources, a nuclear conflict that would level the world seemed entirely plausible. WarGames is often comedic, but in its presentation of Cold War tension, it's deadly serious. The film opens by showing us an underground command center where nuclear missiles are controlled, and two low-level Air Force officers suddenly met with orders to activate a launch. What they don't know is that they're being tested. But neither does the audience. This is an extremely tense moment in a movie with its share of levity. Ultimately, one of the officers (The West Wing's John Spencer) is unable to fire the missile because he can't be sure. This, the command structure comes to be persuaded, is why as much control as possible needs to be put in the hands of Joshua's unflinching AI. While arguments in science-fiction about the ethics of AI are older than WarGames, this very practical application of those questions to present-day national security was not something one encountered very often at the summer cineplex in 1983. The Terminator was still more than a year away, and that film doesn't really ask a lot of ethical questions about the feasibility of AI; it mostly just asks the audience to take for granted that an AI would be an unflinching killing machine bent on exterminating humanity, without elaborating on why or how. WarGames at least explains the principle of unintended consequences by taking us through the steps of how it might happen. Joshua is not evil. He has none of the menace of pre-Twins or Kindergarten Cop Arnold Schwarzenegger. If he is going to end the world, it's going to be by accident, which is probably the only way the world was ever going to end as a result of Cold War tensions.

A lot of people saw this movie. It made nearly $80m in 1983 dollars on a $12m budget. It made a lot of those people think about issues of nuclear threat and where society was going with computers. That’s obvious. What is less obvious, and completely true, is that one of those people was Ronald Reagan. Then-president and former B-movie actor Reagan was fond of watching contemporary movies, and he screened WarGames at Camp David the weekend it was released. Reagan was very unsettled by what he saw in the film. Days later, in a White House meeting, the President asked General John Vessey, the Chairman of the Joint Chiefs of Staff, “Could something like this really happen?” When Reagan began to recount the events of the film, eyes rolled. The President might as well have been raising the national security implications of Tron for the 16 senior members of Congress present. But one week later, General Vessey returned with a startling answer: “Mr President, the problem is much worse than you think.” Hackers were real. The national security establishment was not secured against them. It wasn’t a coincidence. To write WarGames, Lasker and Parkes had done their research. Among the people they interviewed was Willis Ware, who for years headed the computer science department at the RAND Corporation, and who had authored a paper (one of many) way back in 1967 entitled Security and Privacy in Computer Systems, which outlines the dangers of remote access to networks with eerie foresight, examining the military dangers, but also the (increasingly important today) dangers to consumer privacy.

It was probably a good thing Reagan chose WarGames over Psycho II or Octopussy, out the same week, because his shot in the dark to General Vessey did lead to real policy changes. There was a significant revamp of cybersecurity at the Pentagon, and Congress subsequently passed anti-hacking laws, the hearings for which included the playing of clips from the movie. Nevertheless, that didn’t stop the future events contemplated in WarGames from becoming very real.

In February of 1998, the Department of Defense was rocked by a number of intrusions into its networks. The attacks were widespread, appeared to come from overseas, and occurred at odd hours. Since President Clinton was preparing potential military action against Iraq at the time because of disputes over UN weapons inspections, and some of the attacks originated in the Middle East, the two things were suspected to be related. The government launched a multi-agency investigation codenamed "Solar Sunrise." A 24-hour watch was set up to catch the intruders, and when it worked, the government was surprised to find: David Lightman. The intrusions had been committed by two high school students in Northern California and a 19-year-old hacker in Israel. The odd hours of the attacks coincided with school dismissal time on the West Coast.

Nuclear close calls and computer errors were also very real, even if we didn't know it at the time. On September 26, 1983, while WarGames was still in theaters, a Soviet satellite early-warning system reported the launch of up to five American nuclear missiles. Tensions were high, and the possibility of an American attack real, as the Soviet Union had, three weeks prior, shot down Korean Air Lines flight 007 from New York to Seoul when it drifted into Soviet airspace. Nevertheless, Lieutenant Colonel Stanislav Petrov of the Soviet Air Defense Forces decided the alert was most likely a false alarm and did not report it to his superiors, in defiance of orders, just like John Spencer's character in WarGames. It was subsequently determined that the Soviet early-warning system had been triggered by a rare alignment of sunlight on high-altitude clouds. The incident was not publicly known until the 1990's, and Petrov was honored at the UN in 2006 for having possibly averted a nuclear war.

Sneakers, almost a decade later, was less noticed than WarGames. While its overall grosses were in line with those of its predecessor, it was easily outshined at the box office by fare like Batman Returns and Unforgiven when it was released on the then-unremarkable calendar date of September 11, 1992. Though this is a very commercial comic "caper" action movie, it probably suffered from having a leading man who was fast approaching 60 years old at the time. Robert Redford (born 1936) portrays baby-boomer Martin Bishop (né Brice), who has been on the run and living under an assumed identity since narrowly escaping police in 1969 while then a (presumably not 33-year-old) college undergraduate hacking the Republican National Committee and the Federal Reserve. Bishop fled the scene in a Volkswagen van, leaving behind his partner in crime Cosmo to face the authorities, despite the fact that people born in the mid-1930's didn't ordinarily drive Volkswagen vans unless their name was Charles Manson. But the discrepancy over Redford's age (presumably the interest of a star of his calibre in this project was more important) is easy to forget once the movie gets going. In the modern day, Cosmo has died in prison, and the still-hiding Bishop commands a crackerjack team of "sneakers", security professionals with shady backgrounds who are hired by businesses to infiltrate their operations and expose vulnerabilities. This industry exists in the real world, and is a form of white hat (non-malevolent) hacking, often referred to as "penetration testing." In the film, the team is populated by a stellar cast of supporting players which includes Sidney Poitier, Dan Aykroyd, River Phoenix, and David Strathairn, all with exceptionally fleshed-out personalities for a movie in this normally uninventive genre. Bishop is retained by agents from the National Security Agency to steal a device, the invention of a scientist (Donal Logue), which the NSA explains has been created for the Russians. If the operation is a success, the NSA will not only pay $175,000 (screenwriters endowed rewards so much more modestly in 1992) but also clear Bishop's (Brice's) name in his old case. When Bishop pulls off the caper, and the team returns to the office to celebrate, they discover the device is more than what they were led to believe. It is, in fact, capable of bypassing encryption and allowing access to seemingly any protected data in the world. Soon, Logue's character has been murdered and it's revealed their client isn't the NSA at all, but instead a still-living Cosmo (Sir Ben Kingsley), whose plan was not only to secure the device but also to exact revenge.

There are a lot of levels to what's happening in Sneakers, and to how it foretold future events. For one, many people in 1992 were unaware the NSA existed. The agency, which some in government had historically, only half-comically, called "No Such Agency", had not (then) been the subject of the myriad successes (and scandals) for which the more operational CIA and FBI were already known. The NSA's dominion, in the intelligence community, is communications, and communications were much less important in 1992, when most people had never heard of the internet either. That changed with the 1990's rise of the home computer and later the cell phone, and particularly with the general expansion of the national security state in the years after 9/11. But if the public failed to notice the prescience of Sneakers when it was released, the same couldn't be said of the NSA.

“The world isn’t run by weapons anymore, or energy, or money, it’s run by little ones and zeroes, little bits of data. It’s all just electrons… …There’s a war out there, old friend. A world war. And it’s not about who’s got the most bullets. It’s about who controls the information. What we see and hear, how we work, what we think… it’s all about the information!”

These are the words of Ben Kingsley, also on a rooftop, spoken to Robert Redford in the climax to Sneakers. One of the people who went to the theater and saw this movie in 1992 was John Michael McConnell. "Mike" McConnell was no ordinary audience member, or at least not ordinary as a member of the audience for this movie. He was the recently-appointed Director of the National Security Agency. And rather than take offense at Lasker and Parkes' depiction of his agency in the film, he found it extraordinarily insightful, and Kingsley's monologue gripping. McConnell advised all of his coworkers to see Sneakers, and he even obtained a copy of the film's last reel and screened it for the agency's top officials, telling them that this was the vision of the future they should keep foremost in their minds.

“…There’s a war out there, old friend. A world war. And it’s not about who’s got the most bullets. It’s about who controls the information.”

But the direct national policy consequences of WarGames and Sneakers were mostly still background events. The two American wars that followed 9/11, in Afghanistan and Iraq, were more or less traditional, revolving around invasions by troops and live fire. But beneath the surface, a war, perhaps even a world war, much more like the kind Kingsley spoke of was beginning.

Today, the United States is in a major war with at least four other countries. But with one emerging exception, no shots are being fired, and very little discussion is had publicly about the nature and scope of this war. The main belligerents are China, Russia, Iran, and North Korea. There are others, and virtually every nation on the planet has some involvement in this war, but these are the ones who have attacked American targets most directly. If you picture David Lightman when I mention computer hacking, you’re both right and wrong. Many of the participants in this war are not unlike him. However, the attacks they are engaged in are not mischief, and the danger is not of a misunderstanding like that character had with Joshua. But the worst-case scenario could one day be on par with that which is narrowly averted in the final minutes of WarGames.

In 2008, a senior Chinese diplomat contacted the presidential campaign of Senator John McCain to express his displeasure. The official was miffed about a letter from McCain to newly-elected Taiwanese President Ma Ying-jeou, in which candidate McCain had pledged his support for Taiwan. The call from China left McCain Asia policy adviser Randall Schriver surprised, to say the least, because the letter in question had yet to be sent. "He was putting me on notice that they knew this was going on," Schriver would later tell NBC News. "It certainly struck me as odd that they would be so well-informed." While the Chinese would later deny it, the truth was that McCain's campaign had been hacked by the People's Republic, as had Senator Barack Obama's. China's goal was to export massive amounts of data from the campaigns, including policy papers and personal emails. These were just two attacks by China, noteworthy because of their political gravity. China has conducted countless cyberespionage intrusions into American businesses for the purpose of stealing industrial secrets.

The 2008 attack, widely revealed by 2016, should have put candidate Hillary Clinton's campaign chair, John Podesta, on notice to be extra vigilant (it didn't). During that campaign, Podesta fell victim to a spear-phishing attack. Phishing is an attempt to get a victim to give up personal information; spear-phishing involves targeting a particular victim. A very ordinary phishing attack involves clicking a link in an email, which brings the user to a site they are tricked into believing is one they trust. In the Podesta attack, it was his personal Gmail account that was compromised: the hackers were able to steal his Gmail password because he, wrongly believing himself to be logging into Gmail, willingly gave it to them. The hackers then stole thousands of emails, which they would use over the final days of the 2016 election in an attempt to humiliate Clinton and aid Donald Trump. Subsequently, after a study of the attack, the intelligence community and independent security experts laid the blame at Russia's door. Specifically, the collection of Kremlin hackers commonly called "Fancy Bear."
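To make the mechanics concrete, here is a minimal, hypothetical sketch (in Python) of the deception such an attack relies on: the email looks like it comes from a trusted service, while the login link beneath it actually resolves to a lookalike host the attacker controls. Nothing here reproduces the real 2016 attack; the domain names and the allow-list are invented purely for illustration.

    # A hedged illustration of why phishing works: only the real hostname behind a
    # login link matters, and a hurried reader rarely checks it.
    from urllib.parse import urlparse

    TRUSTED_LOGIN_DOMAINS = {"accounts.google.com"}  # hypothetical allow-list for this sketch

    def link_is_suspicious(url: str) -> bool:
        """Return True if the link's actual hostname is not a trusted login domain."""
        host = (urlparse(url).hostname or "").lower()
        return not any(host == d or host.endswith("." + d) for d in TRUSTED_LOGIN_DOMAINS)

    # A genuine login link passes; a lookalike host fails.
    print(link_is_suspicious("https://accounts.google.com/signin"))          # False
    print(link_is_suspicious("https://accounts-google.com.evil.example/x"))  # True

The point, in other words, is not technical sophistication but a mismatch the eye glides over: Podesta's password wasn't cracked, it was handed over.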

Some attacks have been against private, non-government actors, but nevertheless for political purposes. In February of 2014, a massive cyber attack was launched against computers belonging to the Las Vegas Sands chain of casinos. The company lost access to email and phones, and hard drives were wiped clean at a rapid rate. While customer credit card and other data was stolen, the target of the attack was not selected at random. In October of 2013, Las Vegas Sands founder and CEO Sheldon Adelson had said during an event at Yeshiva University that the United States had to take a hardline position to combat the emerging nuclear threat in Iran, going so far as to advocate the detonation of a nuclear bomb in unoccupied desert as a show of force. Adelson subsequently claimed he had spoken in hyperbole, but his comments did not go unheard in Tehran. The Ayatollah Khamenei responded that, "If Americans are telling the truth that they are serious about negotiation, they should slap these prating people in the mouth and crush their mouths." The cyber attack, which caused about $40m worth of damage, was traced to Iranian hackers connected to the nation's Revolutionary Guard Corps. The hack came with messages targeting Adelson left on computers, such as "Damn A, Don't let your tongue cut your throat."

In November of 2014, film studio Sony Pictures was hit by the blistering public release of internal, confidential data by a group identifying themselves only as "Guardians of Peace." The release included not only corporate information like internal emails (many of which were extremely humiliating for the studio) but also personal information about employees and their families. Guardians of Peace had stolen the information through a hack against Sony. If there might otherwise have been any confusion about the identity of Guardians of Peace, it was precluded by their demand: that Sony halt the release of The Interview, a comedy starring Seth Rogen and James Franco as filmmakers tasked with assassinating North Korean dictator Kim Jong-un, which was slated to hit theaters the following month. The hackers also threatened terrorist attacks on theaters showing the film. This latter threat was effective in forcing theater owners to decline to show The Interview, and Sony had no choice but to pull the release. However, the studio did distribute the movie online, where it quickly became the most rapidly successful film ever distributed that way. Nevertheless, the film was a huge commercial failure for the studio. During the fiasco surrounding the hack and release of data, actor George Clooney and his agent, Bryan Lourd, sought to stand up to the blackmailers by circulating a petition pledging support for Sony amongst the top echelon of the entertainment industry. Clooney has long been politically active, and Hollywood is a community never shy about showing support for important causes. Clooney believed the hack and demand were an attack on free speech, a particularly important issue for artists and media. Hollywood universally disagreed, or at least refused to agree publicly: not a single person contacted was willing to sign the petition. To Clooney, it wasn't that they supported North Korea's actions; they were simply not brave enough to risk becoming Guardians of Peace's next hacking target. "They know what they themselves have written in their emails, and they're afraid," he told Deadline.

These attacks are all shocking and invasive. They are accurately described as acts of global terror. Nonetheless, they were purely attacks on information. Even where there were real world consequences, they were measured in dollars, not lives. This is probably why they were not more newsworthy when they occurred, though all were reported. Comparing them to WarGames’ high stakes of a nuclear war may seem absurd. But hacking by governments (or anyone else) is not limited to information attacks. Much more dangerous, and growing in number, are attacks on industrial control systems. This is the technology that operates things. This is the computer that controls a self-driving car, or a hydroelectric dam, or increasingly, every appliance in the home. A properly executed hack on technology like this could be devastating in lives lost.

As I was writing this essay, Iran and the United States came very near a very hot war. Iran-backed militants attacked the US embassy in Iraq, then a drone strike hastily ordered by President Trump killed Iran's top general, Qasem Soleimani. Iran responded by firing missiles at targets in Iraq very near US soldiers, but no Americans were killed. It now appears likely that hostilities are lessening, and a return to the status quo is the most likely outcome. But listening to the commentary over the week, which often contemplated a World War 3 breaking out, I wondered what the speakers believed that status quo to have been prior to the sudden hostilities. Iran and the United States have never enjoyed peaceful relations in the 40-year modern history of that nation. In a plot that would have surely killed many Americans, Iran planned in 2011 to assassinate the Saudi ambassador to the US with a bomb at a Washington, DC restaurant, only to have its agents apprehended by the FBI prior to carrying out the attack. As recently as June 2019, Iran shot down a US surveillance drone (the airspace position of which is disputed by the two nations) with a surface-to-air missile. Iran is accurately said to be the world's foremost state sponsor of terror, both in Iraq and elsewhere, and the 2015 nuclear deal (for the brief time it lasted) didn't bring a solution to these problems. However, when I consider the state of war between the two countries, it isn't these ambiguous misunderstandings or proxy fights that come to mind.

A decade ago, the United States and Israel launched what is widely regarded as a sea-changing attack in cyber warfare. This was the 2010 Stuxnet attack on Iran's nuclear program. These countries have never claimed responsibility, but nobody doubts it, and nobody credibly suggests an alternative theory. Stuxnet was an attack on industrial control systems which exploited then-unknown vulnerabilities in the Windows operating system, what are called "zero-day exploits", because the software company (here, Microsoft) is unaware of the vulnerability and has thus had zero days to respond to the problem, making an attack almost sure to succeed. Specifically, Stuxnet was a software program that infected computers controlling centrifuges used by Iran to enrich uranium capable of being used in nuclear weapons. The computers targeted were not connected to the internet, so Stuxnet was (somewhat brilliantly) engineered to spread via USB drives, from system to system, lying dormant until it was sure it was on an Iranian computer controlling this type of machine, then striking. Its attack was subtle enough to make Iran think its centrifuges were merely defective, and to keep replacing them. It appeared to be working, until independent cybersecurity researchers discovered the program lying in wait on unrelated computers, and began to deconstruct it and figure out its actual target and objective.

When Stuxnet was revealed, there was very little talk about the action being a violation of Iran's sovereignty, let alone an act of war against the country. This is perhaps because of the refusal of the attackers to reveal themselves, but also because preventing Iran from achieving a nuclear weapon was a laudable goal to many in the international community. But as a precedent, Stuxnet was a dark one. It cannot be assumed that Stuxnet was the first attack of its kind because, since its revelation was so improbable, there's no reason to be sure something of this kind hadn't happened before and gone undetected. But it was surely the first widely known. The precedent, therefore, is that attacks between nations on industrial control systems are now in play. Stuxnet's specific origin within the intelligence apparatus of the United States is rumored to be Tailored Access Operations, an elite and secretive unit of the NSA.

Stuxnet was limited in its reach, and didn't target people, let alone civilians. Attacks that have followed haven't been so reserved. In December of 2015, systems that controlled Ukraine's power grid were hacked, and the hackers were able to shut down electricity to almost a quarter of a million people. This attack, like hundreds of other cyber attacks against Ukraine and other nearby nations before and since, has been widely attributed by security experts in government and private industry to Russia. And The New York Times reported in June of 2019 that, in a move to respond to the 2016 Russian attacks on America's election, the US is currently (secretly) deploying similar weapons to gain access to Russia's power grid.

These attacks are still a far cry from gaining access to a country's nuclear arsenal, as WarGames supposed, but that isn't unthinkable either. There are situations where merely cutting the power will result in deaths, possibly even on a large scale. Imagine if you could disable primary and backup power to vital life support systems in a hospital. In the future, should space colonization become a reality, such attacks could foreseeably cut off an entire population's access to life-saving oxygen. In Star Trek, massive ships fire photon torpedoes. In reality, would they even bother, if an attack on a ship's networks is easier to execute and has a higher probability of success? One day, cyber war may be the only war.

China, Russia, Iran, and North Korea are all major powers. But relative to one another they can seem much smaller, and their power probably decreases by orders of magnitude as you read through that list. But the cyber war is waged on a very equalizing plane. The players are limited only by their cunning. These countries have all struck the United States, and the United States has doubtless struck back (or struck first). Countless other nations have been on both the giving and receiving ends of cyber attacks, but these four seem most intent on (or most effective at) targeting America. At the present time, half the list has nuclear weapons as advanced as any in the world. North Korea claims to have nuclear weapons and is well on its way to making them deliverable. Iran will almost certainly have nuclear arms in the next 20 years. Even if President Obama's 2015 nuclear deal had lasted, it would have only forestalled their development for 15 years, and five have already come and gone. If Iran does succeed, it would be only the tenth country on the planet to have the bomb. Make no mistake, the cyber war is a world war amongst the world's most powerful nations.

If this all seems like a very America-centric view, it is. America hasn't been the site of actual combat with a foreign nation since World War 2, and even then the fighting was effectively limited to the attack on Pearl Harbor. A person living in a country I haven't deemed relevant enough to yet mention, who looks skyward to see American drones firing missiles downward, couldn't be blamed for concluding their nation is in a very real war with the United States. But while that may be the new war-like status quo of American intervention, however sporadic, it isn't where the real power struggles for international control will take place going forward. Like everything else in our culture, that fight is going to play out online. I don't know where or when the next 9/11 will happen, but I suspect it will come from a far-away nation, where even now a figure is seated before a screen, patiently plotting, their fingers grazing the keys along with mine.

Lasker and Parkes were not alone in calling attention to the coming information war. Many professionals, the same people who foresaw the importance of the internet writ large, made predictions of this sort. But in the national security context, 9/11 Cassandra Richard A. Clarke warrants mention again. Clarke has spent easily as much of his prolific career warning of this threat as he has talking about al-Qaeda. In 2000, when Clarke was serving in the Clinton administration, he invited a long-haired hacker named Mudge to a White House summit on web security. Mudge was the lead personality in the Boston hacker think tank The L0pht, and was closely identified with the famed hacker collective the Cult of the Dead Cow. At the meeting, the casually-dressed Mudge warned the more formally attired cabinet members and industry professionals about how little protection secured the nation’s technological infrastructure. If they, or the President, failed to take Mudge seriously then, that wouldn’t always be the case. In the post-9/11 world, Mudge cut his hair and increasingly used his real name, Peiter Zatko. He is now well known for the work he subsequently did developing tools and protocols for the Department of Defense, creating both defensive and offensive cyber security strategies. Thereafter, he went to work at Google.

If Mike McConnell, or anyone else at the NSA, drew inspiration for the direction of the agency from Sneakers, perhaps they were listening too closely. In the film’s final scene, the ‘real’ NSA shows up to collect the powerful decryption device that Bishop and crew have by then stolen back from Cosmo. As if Sneakers needed to add another renowned performer, James Earl Jones appears as a mysterious representative of the agency who wants the technology. Redford, remembering information he received earlier in the movie, points out that the device would be no good against the Russians, because “…their codes are completely different than ours.” This defuses any speculation on my part as an audience member that the technology of the movie might by some theory be plausible, but that isn’t why the scene is worth noting. The only thing the device would be good for, the Sneakers surmise, “…is spying on Americans.”

In 2013, NSA contractor Edward Snowden revealed what many had suspected since the War on Terror began, but what perhaps no one outside the government had comprehended the far-reaching scope of: the agency (in cooperation with other intelligence agencies, the US’ Five Eyes partner nations, and several large communications corporations) was engaged in mass surveillance so pervasive as to allegedly touch the lives of every American. While the government’s version of the story and Snowden’s are not a precise match, no one disputes that the tens of thousands of documents he revealed are genuine. Many anti-surveillance crackpots have made startling claims about the surveillance state since the War on Terror began, but far fewer have been indicted for espionage, as Snowden has been; he currently enjoys (though it is unclear how enjoyable it is) asylum in Russia, courtesy of Vladimir Putin. That being the case, it is hard to ignore how chilling his descriptions of the NSA’s operational capacity can be. “I, sitting at my desk, [could] wiretap anyone, from you or your accountant, to a federal judge or even the president, if I had a personal email,” he told The Guardian’s Glenn Greenwald in 2013. Only Sneakers’ imagining that a highly advanced super-device would be required seems far-fetched in light of Snowden’s disclosures.

That isn’t entirely true, and the NSA’s power over information isn’t absolute. Some things are genuinely encrypted such that prying eyes can’t see them without a key, so far as we know. But one could easily end all of those statements with “yet.” Much of the world’s encryption technology is not based on the creation of a puzzle that is impossible to solve, but rather one that would take a traditional computer a seemingly infinite amount of time to solve. These encryption processes use “trapdoor” mathematical functions. As MIT Technology Review recently put it:

Trapdoor functions are based on the process of multiplication, which is easy to perform in one direction but much harder to do in reverse. For example, it is trivial to multiply two numbers together: 593 times 829 is 491,597. But it is hard to start with the number 491,597 and work out which two prime numbers must be multiplied to produce it.
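
To make that asymmetry concrete, here is a minimal sketch in Python (my own illustration, not anything drawn from the film or the MIT piece). The forward direction is a single multiplication; reversing it by brute-force trial division already takes hundreds of steps for this toy example, and the cost explodes as the numbers grow.

```python
# Toy illustration of a "trapdoor" function: multiplying two primes is easy,
# recovering them from their product is comparatively hard.

def trial_division(n):
    """Find two factors of n by brute force.
    Fine for toy numbers; hopeless for the hundreds-of-digit
    moduli that real encryption relies on."""
    candidate = 2
    steps = 0
    while candidate * candidate <= n:
        steps += 1
        if n % candidate == 0:
            return candidate, n // candidate, steps
        candidate += 1
    return n, 1, steps  # n turned out to be prime

# The easy direction: a single multiplication.
print(593 * 829)               # 491597

# The hard direction: hundreds of divisions even for this tiny example.
print(trial_division(491597))  # (593, 829, 592)
```

A real 2048-bit modulus runs to more than 600 decimal digits, so a brute-force search of this kind would not finish on any classical machine within the lifetime of the universe.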

This is where Sneakers’ super-device starts to become more plausible. Enter: quantum computing. Quantum computers are highly untraditional, probably still theoretical, or at least prohibitively impracticable, computers of the future that I won’t try to pretend I understand. A quantum computer “thinks” in a fundamentally different way than any computer you’ve ever used. But the idea that you might be carrying a future iPhone with a quantum computer onboard seems totally out of the question; a quantum computer requires enormously precise conditions to function. The point is that these very advanced and powerful computers might eat through today’s encryption the way the “black box” in Sneakers did. If one accepts that, as Sneakers suggested and the NSA clearly believes, the information war is the war of the future, the quantum computer starts to look a lot like the atomic bomb used to. But only time will tell how real a quantum computing revolution might be, and quite possibly that will take more time than I have left to look forward to on the planet.

Lasker and Parkes haven’t written a movie since Sneakers, nearly 30 years ago, and it is the only film other than WarGames either is credited with writing, though both developed the Best Picture Oscar-nominated Awakenings in 1990 and are credited as its producers. While Parkes has produced over 50 films, such as Men in Black and Minority Report, and ran DreamWorks for several years, Lasker doesn’t have a screen credit after Sneakers. Despite their divergent paths, the two appeared on a panel at Google in 2008 to discuss WarGames, on the occasion of its 25th anniversary. In discussing the genesis of the movie, Lasker explained that he was inspired to write a movie about Stephen Hawking, and the possibility that Hawking might one day discover a unified theory of physics but be unable to communicate the breakthrough because of his debilitating ALS. He and Parkes imagined a scientist like Hawking being paired with a juvenile delinquent, and how the two might come to understand each other. Both characters exist in the final film, though clearly they evolved a great deal from the original concept. The technology the actual WarGames is known for was incidental to Lasker and Parkes; a curiosity they ran across as they attempted to create an environment in which to set the characters they were already envisioning. It was a research trip to Stanford that led them to Peter Schwartz, a futurist who explained to them what was then emerging in tech, and who appeared alongside them on the panel in 2008. As Parkes put it at Google, to write the other way, to start with the technology as your endpoint and then try to create the characters (which the pair have on other occasions attempted), is a “difficult” and “…often self-defeating process.” Describing the methodology of their research, which led to the technology seen in WarGames, Lasker added, “Let’s find the real story instead of trying to make it up, because it’s usually much more interesting than you can imagine.”

I think this is the problem with Andrew Niccol’s movies, and what separates them from the work of Lawrence Lasker and Walter F. Parkes. In Gattaca, The Truman Show, and Simone, the characters seem to exist to facilitate the story about technology that Niccol wants to tell. It is always fundamentally a story about that technology (more than about the characters), and that technology is based on what he can imagine, which, as Lasker said, is never going to be as interesting as the reality. Lasker and Parkes, by contrast, were mainly concerned with the mechanics of their characters and story. As to which technology made it into the script, and how, Parkes said at Google, “All of our research was done in a context—it was the context of where we wanted the story to go, as opposed to saying ‘how does it work?’” This is why the characters in WarGames and Sneakers are still vivid decades later. When, in WarGames, David Lightman uses his computer hacking skills to change Jennifer’s (Ally Sheedy) grades, it seems completely real, because it is completely real (now, and when I was a teenager in the 90’s) that a nerdy kid would try to leverage his skill with computers to impress a girl. When Sneakers’ Martin Bishop embarks on a very risky mission to steal the black box, it’s nevertheless plausible, because the reward on offer is a reprieve from the life on the run he’s been living for most of his adult years, and he will soon be (or already is) an old man. Contrast these very human stories with the unnatural environs of Niccol’s work. In Gattaca, why would a genetically perfected class need to oppress their unremarkable counterparts in some kind of apartheid state? In The Truman Show, why would a modern-day audience watch a 24-hour-a-day television show tailored to be a facsimile of 1950’s sitcoms? In Simone, why would the entire world fall stupidly in love with a celebrity who is utterly without charm? These three questions are impossible for Niccol to answer, and because his films lack concrete or interesting characters, and are only about the technology, the effectiveness of those movies rests wholly on those questions. But the technology isn’t interesting as a distraction anymore. The movies, today, wholly fail to hold up.

After Simone, Andrew Niccol was not relieved of his command in Hollywood. He’s kept on making competent movies. He followed Simone with Lord of War, which, if The Truman Show introduced Jim Carrey in serious movies, is one of the films that served as notice of Nicolas Cage’s departure from them. The film, a mostly unserious 2005 effort about an arms dealer that nevertheless attempts to be both tragic and topical in its third act, seems especially rooted in a pre-9/11 worldview. But it was a modest critical and commercial success, which also describes Niccol’s overall career. 2011’s In Time even grossed $174m on a $40m budget, but its story of a future where people stop aging at 25 and time itself is bought and sold suffers from many of the same criticisms I’ve outlined against Niccol’s other work. Niccol was quickly sued over In Time by the science fiction writer Harlan Ellison, who alleged the idea was purloined from his work. Its main strength was starring Justin Timberlake and Amanda Seyfried, which doesn’t speak highly of Niccol’s craft. Last year he released (via Netflix) Anon, yet another vision of a dystopic future, this one a world where people have continuously recording cameras in their eyes, and a detective (Clive Owen) must stop a killer who has hacked the technology to conceal their identity while murdering. While Anon’s financial viability is known only to Netflix, critics (and users who spoke via RottenTomatoes) didn’t approve, and the film languishes there today with an overall rating of 38%.

I thought Anon was a perfectly competent sci-fi detective story. It wasn’t anything exceptional, although the visuals were very well done. If there was a reason to care about the characters, or what their goals were, it escaped me, but I wasn’t bothered enough to turn off the movie. If history is any predictor, I won’t need to write critically about that fact until some time around 2038. The movie’s most glaring flaw, to me, was that it wasn’t an original idea. The idea had already been executed, better than Niccol could hope to match, in a 2011 episode of the British sci-fi anthology series Black Mirror.

As with its two prior failed rebirths, the emerging verdict on Jordan Peele’s The Twilight Zone seems to be a negative one. Even amongst those who like the series, nobody is suggesting it can hold a candle to Rod Serling’s original. If Serling’s five-season masterpiece has a cultural inheritor on present-day TV, it is actually Charlie Brooker’s Black Mirror. Black Mirror was created for Britain’s Channel 4 in 2011, but moved to Netflix in 2015, where it has resided since, which is why Netflix’s underwriting of Anon seemed especially odd. Black Mirror, in its first season, aired an episode called The Entire History of You, about characters who have eye implants like those in Anon. That episode, however, doesn’t use the tech to tell a clumsy detective story with a predictable conclusion. Instead, it’s a cringe-inducing examination of human jealousy and the sometimes irresistible urge to learn things we are likely better off not knowing. It’s an extremely human story, and one most adults will find relatable, even if they are much better than the protagonist at suppressing this particularly self-destructive drive. All of Black Mirror’s stories are about technology, in one way or another, but they are equally about people, and what (their creators imagine) people would do if confronted with those theoretical technologies. Again, Netflix’s numbers (with the rare exceptions they choose to publicize) are their own, but all indications are that Black Mirror has become extremely popular for the genre. It’s been airing new episodes for nearly a decade, and on RottenTomatoes its average score from audiences and critics hangs above 80%.

There’s no utility to technology without human beings. When it comes to a theoretical technology, we are interested in what it means for our actual lives. We want to know what it will make better—diseases we can cure, lifespans we can extend, planets we can travel to, machines which can protect us, etc. Or worse—all of these same things, from another point of view. Any worthwhile examination of technology necessarily revolves around the human element. While sci-fi has sometimes imagined a future Earth without people, where the planet is taken care of by robots, such technology would actually be pretty pointless. Nature doesn’t really make value judgments as between organisms. If human beings were extinct, and these robots were running around preserving the environment, it would just mean human values were being preserved, rather than some objective natural value. You may think the spotted owl is worth preserving, and the spotted owl probably agrees with you, but I assure you that if you are instead a nearby red-tailed hawk who will die if you don’t eat soon, the calculus is very different. Even if the value judgment is as simple as, “I choose living things over non-living things,” there’s no possible Earth best-suited to living things. Extremophiles are microorganisms that live in the most extreme conditions. Recently, such life has been found thriving in Pitch Lake, the largest natural lake of liquid asphalt in the world. If the worst climate change apocalypse currently imaginable does come to pass, there are some living things that will be the big winners, at the expense of very complicated things like ourselves. Whether that’s better or worse than the world as we know it now is totally within the (human) eye of the beholder. So technology doesn’t have much application beyond our own lives, even if in our worst nightmares (and in a very possible reality) it is the cause of the end of those lives.

Nobody can predict the future. Especially now, when the future is moving so quickly. The best any filmmaker can do is predict that future by accident. But the timeless qualities of people, be those qualities ugly or beautiful, can be examined in art quite well. This is all that even the best movies can hope to achieve. As a teenager, I went to the cineplex to see a bold vision of the future. 20 years later, I go there to see a bolder vision of myself.
