During the final days of filming The Informant!, the director Steven Soderbergh asked me what I planned on writing next. I told him I’d been contemplating a science-based pandemic movie tracking a novel virus that had randomly jumped into the human population.

My father, Dr. Neal M. Burns, an aerospace engineer turned academic, had been concerned since the late 90s that a strain of avian flu (H5) might mutate in such a way that it could be transmitted from a migrating bird to livestock to human to human to … you get the idea.

My father spent his mornings patrolling the small stories of large newspapers looking for signs of such a progression. On February 21, 2003, he read an article to me from page A9 of The New York Times about a 33-year-old man who had died from a mysterious disease that had come from either a goose or a duck. The man’s young son was reported to be in intensive care; his daughter had died earlier that month from the same disease.

Soderbergh liked this idea and thought the film could be done in the style of a 70s disaster movie: Outbreak meets Jaws meets The Poseidon Adventure. The cast would be large and starry, and part of the experience would be the shock of seeing famous actors die unexpectedly all along the way. Clearly, a movie that kills famous people in the first and second acts is dealing with a very formidable problem for its characters.

Contagion was released in 2011. When we were promoting the film, Steven and I told the press that the experts we had consulted had always said it’s not a question of “if” another pandemic would occur but “when.”

Eight years later, “when” arrived.

As the coronavirus spread, Contagion became one of the most popular films on iTunes and Amazon Prime, and I was periodically asked to appear on news programs to comment on events. I declined these requests, saying the time was better given to the experts who had advised us on the film. Producers from these programs told me people might better relate to a screenwriter than to a scientist, which is a huge, and not unrelated, problem.

I was also asked to write a sequel. Steven cautioned against it; if we were all living it, there was no need to go see it play out in a theater. He was right—and the theaters all closed.

Strange Brew

Then, in November 2022, a large-language-model A.I. called ChatGPT appeared in the world.

Steven and I wondered whether this new technology might be able to take everything we had learned to date from the pandemic we are now—possibly—emerging from and help create a sequel. (This was months before A.I. became a plot point in the recent writers’ strike.) After all, if Contagion had proved prescient in certain ways about what we had survived, perhaps “Contagion 2” (working title) could help prepare us all for the next “when”—while succeeding at the box office.

Steven and I began by asking ChatGPT if a sequel to Contagion was a good idea. It replied:

“Determining whether it is a good idea to make a sequel to ‘Contagion’ is subjective and depends on various factors. While the original film was well-received and respected for its realistic portrayal of a pandemic, the decision to create a sequel involves considerations such as audience interest, creative vision, and the ability to deliver a compelling story.”

Interesting? Creative? Compelling? This made me wonder if the “GPT” part of its name stands for “Glaringly Pedantic Truisms.” Could this A.I. thingy really beat us at our own game?

In 1996, a chess-playing computer named Deep Blue lost a much-heralded match against world champion Garry Kasparov by a score of 4–2. The following year, an upgraded Deep Blue played Kasparov in a rematch—this time, the human was defeated. In 2003, Deep Junior challenged Kasparov to a six-game match with $1 million in prize money on the line. In the final game, Kasparov was well positioned but then unexpectedly offered the machine a draw out of fear that he might make a mistake.


Filmmaking is not chess. Chess is subject to what is called “game-tree complexity,” which means that the number of possible legal moves, although vast, is finite. The stories that we write and commit to film, such as Contagion, are, as far as I can tell, potentially infinite—a strange brew of all the stories we have heard, read, and been a part of, which are then filtered through our biases, traumas, and wishes for the world. For instance, my choice of “strange brew” in the last sentence is possibly informed by a Cream song from 1967, or maybe it was lodged in my brain after I saw the Rick Moranis and Dave Thomas movie of the same name from 1983. Or am I scrambling the Billie Holiday song “Strange Fruit” and the Miles Davis album Bitches Brew?

While chasing after a specific story, the human brain (or at least mine) tends to happily greet most every other thought it meets along the way—musings are joined via some neural trapeze act struggling to connect the world across dendrite and synapse. ChatGPT, on the other hand, is limited precisely to the instructions it is given in a prompt, leaving homeless every tangential thought that ever lived inside parentheses.

As Duke Ellington once said, “There are simply two kinds of music, good music and the other kind.” A.I. only reconstitutes what has been harvested from others, and its guidance tends to highlight what has been successful in the past—an approach Hollywood employs much of the time already. And, as it turns out, the future is not always interested in rewarding past successes. After all, as Hamlet said, “the play’s the thing”—not the prompt.

Furthermore, if these L.L.M.’s are simply rummaging through the Internet and Rumpelstiltskin-ing it with the help of predictive text data, then one might ask: Is the Internet the best landscape in which to go pillaging? Has anyone even attempted to determine if it is currently filled with more information or more misinformation? And is there anyone still brave enough to stand up and claim they are capable of telling the difference?

And that leads to an even more fundamental problem. The makers of generative A.I. do not provide the end user with the sources of their machine’s findings. With no source notes, how can we—or our lawyers—judge the accuracy of its answers? We cannot even get clarity on how Sam Altman, CEO of OpenAI, the developer of ChatGPT, was fired by his own board of human beings last week over what they described as “issues with transparency,” only for him to be returned to his post this week accompanied by a total lack of transparency for his previous dismissal. This might be the best argument yet for the existence of A.I.—as human problem solving is messy and political.

But rest assured, ChatGPT is not infallible. When we asked it to help shorten this essay by eliminating a few hundred words, the following paragraph appeared on my screen:

“I’m in Los Angeles the morning Steven Soderbergh asks me if I’d be interested in writing a movie about a pandemic. I’m getting ready to go to the set of my show, Billions. There’s not much to it. I tell him I’ll think about it. I have been on and off the phone with my father, who has Alzheimer’s. When I tell him what I’m working on, he tells me to get a hold of the ‘Italian in London’—meaning Dr. Marc Lipsitch, a Harvard epidemiologist. My father’s last useful piece of advice to me.”


Of course, I had nothing to do with the show Billions, although I am a great admirer of its creators. And I have no idea who the “Italian in London” is, but I am pretty sure it is not Dr. Marc Lipsitch—whom I have never spoken to in my entire life and who is neither an Italian nor a Londoner. This is what A.I. engineers call a “hallucination.”

Speaking of hallucinations, I sent this piece to Steven because I did not want to “scrape” our shared history without his consent. He wrote back pointing out that I was mistaken in my recollections.

Here is what really happened. Steven had asked me if I wanted to write a movie about the propagandist filmmaker Leni Riefenstahl. The idea he put forward was a film set during the making of Triumph of the Will—think Truffaut’s Day for Night but with Nazis.


I had done a fair amount of research on Riefenstahl when Steven called to say that, upon further reflection, he had decided the audience for a dark comedy that interrogates Riefenstahl’s rationalizations was decidedly small.

And then he asked, “What else you got?”

It was actually at that point that I told him about making a fact-based pandemic movie. How come when I get something wrong, it’s called a mistake? My heart goes out to Garry Kasparov and his request for a draw.

The Death of Expertise

And that leads us back to “Contagion 2.”

ChatGPT said the original ends on “a rather somber note” and suggested that Steven and I consider “a more hopeful ending” for the sequel. Rather than “highlighting the cost in human lives and societal disruption,” we were advised to try “emphasizing resilience, innovation, and global cooperation.”

Given our mangled recent history of injected bleach and evil 5G towers, I’m not so optimistic that resilience, innovation, and global cooperation will be in evidence the next time. Because there will be a next time. On that, ChatGPT and I agree. Misinformation and disinformation are not new phenomena. Back in the Dark Ages, it was believed that the bubonic plague came from Jews and not from rodents infected with the bacterium Yersinia pestis. So I can understand how generative A.I. may be too uptight to tolerate the chaos and inspiration necessary for creating art, but one hopes that it can be relied upon to provide us with sound science free from “hallucinations” such as ivermectin.

Although my father was right to be concerned about novel viruses, in the end it was Alzheimer’s that stopped him from reading the morning paper. The last flight I took before lockdown was to say good-bye. I was the only passenger on the plane wearing a mask—based on the advice of Dr. Ian Lipkin, one of the experts we consulted during the making of Contagion. The date was February 28, 2020—a few weeks after the first case of coronavirus was diagnosed in the U.S. My father died the next day.

I am grateful he missed what was to follow.

Before his cognitive abilities began to diminish, he had become concerned with what he called “the death of expertise.” With the birth of A.I., I now share this concern. Could it get Dorothy back to Kansas? Depends on her prompt-writing ability. But at least in Oz we met the man behind the curtain and learned his limitations. With A.I., the wizard is inscrutable, and we are left to wonder if it has any limitations at all.

I asked ChatGPT about the death of expertise. It replied:

“Yes, the ‘death of expertise’ is considered a problem in society by many observers. This term refers to a general trend where the input of trained and certified experts is dismissed, often in favor of personal beliefs, anecdotal evidence, or misinformation spread through social media or other non-traditional means.

“This phenomenon can pose several issues:

“1. Misinformation and Misinterpretation: Without a sound understanding of a subject, individuals might spread misinformation or misinterpret facts, leading to confusion or harmful consequences.

“2. Public Health: As seen during the COVID-19 pandemic, skepticism towards scientific expertise can contribute to vaccine hesitancy and non-compliance with public health measures, thereby hampering efforts to control disease spread.

“3. Policy Decisions: Expertise is crucial in shaping sound policy decisions. Ignoring expert advice in areas like climate change, economic policy, or national security can lead to inadequate or harmful policies.

“4. Undermining Trust: The death of expertise can undermine trust in institutions such as universities, government agencies, and scientific organizations, leading to societal instability.

“5. It’s important to remember that while questioning and skepticism are healthy parts of scientific and societal discourse, the dismissal of proven expert knowledge without basis poses significant risks. Expertise does not mean infallibility, but it usually offers the best guidance we have in complex matters.”

Hell, my father told me all that 10 years ago.

Scott Z. Burns wrote the screenplay for Contagion.